Completed Academic Program Assessment Plans 2008-2009



Report: Assessment Plan Details for: Art

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Art Outcome Set

Outcomes

Resist Grade Inflation

Mapped to: No Mapping

Measures & Findings

100- and 200-level course grades (Program level; Direct - Other)

Details/Description: The department will strive to meet a target for A’s and B’s in 100- and 200-level foundation studio courses, based on 14-day enrollment.

Target: No more than 30% A’s; no more than 40% B’s

Implementation Plan (timeline): Annual: September

Key/Responsible Personnel: Foundation faculty; Department Head Reports

Supporting Attachments:

Findings for 100- and 200-level course grades

Summary of Findings: Fall 08 foundation studios awarded 14.6% A’s and 46.5% B’s (total A’s and B’s: 61.1%). Spring 09 foundation studios awarded 18.9% A’s and 58% B’s (total A’s and B’s: 76.9%). Percentages are based on 14-day enrollment figures.

Target Achievement: Exceeded

Recommendations : The department will continue to review all foundation coursework to ensure both consistency and rigor throughout the foundations curriculum.

Notes :

Substantiating Evidence:

Formal Competency

Mapped to: No Mapping

Measures & Findings

Senior BFA Thesis Outcomes (Program level; Direct - Portfolio)

Details/Description: In the capstone BFA Senior Thesis Exhibition requirement, student outcomes for formal competency are assessed by departmental faculty.

Target: Minimum mean average of 80% of "Satisfactory" or better in formal competency.

Implementation Plan (timeline): Annual: April

Key/Responsible Personnel: Departmental Faculty; Department Head Reports

Supporting Attachments:

Findings for Senior BFA Thesis Outcomes

Summary of Findings: 92% (mean) of rankings were “Satisfactory” (or higher) in formal competency.

Target Achievement: Exceeded

Recommendations : None

Notes :

Substantiating Evidence:

Sophomore Review BFA (Program level; Direct - Portfolio)

Details/Description: Students admitted into the BFA studio programs will have demonstrated an appropriate level of formal competency as assessed in the Soph. Port Review.

Target: 80% score B or above in "Formal Competency" in formal assessment by faculty jury.

Implementation Plan (timeline): Annual: April

Key/Responsible Personnel: UTC Department Faculty; Department Head Reports

Supporting Attachments:

Findings for Sophomore Review BFA

Summary of Findings: 23 students (of 25 total admitted into the BFA program) received a “C or above” evaluation in formal competency, for a total of 92%.

Target Achievement: Exceeded

Recommendations : None

Notes :

Substantiating Evidence:

Sophomore Review BS (Program level; Direct - Portfolio)

Details/Description: Students admitted into the BS program in Art Education will have demonstrated an appropriate level of formal competency as assessed in the Soph. Port Review.

Target: 80% score B or above in "Formal Competency" in formal assessment by faculty jury.

Implementation Plan (timeline): Annual: April

Key/Responsible Personnel: UTC Department Faculty; Department Head Reports

Supporting Attachments:

Findings for Sophomore Review BS

Summary of Findings: 2 students (of 3 total admitted into the BS program) received a “C or above” evaluation in formal competency, for a total of 66%.

Target Achievement: Not Met

Recommendations : The total number of students seeking entry into the BS may not provide a large enough pool for this assessment measure to be meaningful.

Notes :

Substantiating Evidence:

Oral and Written Competency

Mapped to: No Mapping

Measures & Findings

Senior BFA Thesis Outcomes (Program level; Direct - Portfolio)

Details/Description: In the capstone BFA Senior Thesis Exhibition requirement, student outcomes for written competency are assessed by departmental faculty.

Target: Minimum mean average of 80% of "Satisfactory" or better.

Implementation Plan (timeline): Annual: April

Key/Responsible Personnel: Departmental Faculty; Department Head Reports

Supporting Attachments:

Findings for Senior BFA Thesis Outcomes

Summary of Findings: 92% (mean) of rankings were “Satisfactory” or higher in written competency.

Target Achievement: Exceeded

Recommendations : None

Notes :

Substantiating Evidence:

Sophomore Review BFA (Program level; Direct - Portfolio)

Details/Description: Students admitted into the BFA studio programs will have demonstrated an appropriate level of oral competency as assessed in the Soph. Port Review.

Target: 80% score B or above in "Oral Competency" in formal assessment by faculty jury.

Implementation Plan (timeline): Annual: April

Key/Responsible Personnel: UTC Department Faculty; Department Head Reports

Supporting Attachments:

Findings for Sophomore Review BFA

Summary of Findings: 22 students (of 27 total admitted into the BFA programs) received a “C or above” evaluation in oral competency, for a total of 81%.

Target Achievement: Exceeded

Recommendations : None

Notes :

Substantiating Evidence:

Sophomore Review BS (Program level; Direct - Portfolio)

Details/Description: Students admitted into the BS program in Art Education will have demonstrated an appropriate level of oral competency as assessed in the Soph. Port Review.

Target: 80% score B or above in "Oral Competency" in formal assessment by faculty jury.

Implementation Plan (timeline): Annual: April

Key/Responsible Personnel: UTC Department Faculty; Department Head Reports

Supporting Attachments:

Findings for Sophomore Review BS

Summary of Findings: 2 students (of 4 total admitted into the BS program) received a “C or above” evaluation in oral competency, for a total of 50%.

Target Achievement: Not Met

Recommendations : The total number of students seeking entry into the BS may not provide a large enough pool for this assessment measure to be meaningful.

Notes :

Substantiating Evidence:

Student Preparedness

Mapped to: No Mapping

Measures & Findings

Professional Objectives: Actual (Program level; Direct - Other)

Details/Description: Measure of the percentage of students (with intention) who indicate actual application and/or acceptance into one or more graduate programs and/or indicate employment in an arts related profession.

Target: Within one year, 80% of students with intention indicate application and/or acceptance into one or more graduate programs and/or indicate employment in an arts related profession.

Implementation Plan (timeline): Annual: Report in September on May graduates

Key/Responsible Personnel: Department Faculty; Department Head reports

Supporting Attachments:

Findings for Professional Objectives: Actual

Summary of Findings: Of 17 graduating majors indicating intention to do so, 15 maintain a professional studio and/or are employed full-time in an art/design related field, 0 have applied for and been accepted to a graduate program, and 1 is teaching full-time in area primary or secondary art programs, accounting for 94% of our graduating majors.

Target Achievement: Exceeded

Recommendations : Revise assessment to report student outcomes after three years to better reflect entry into graduate-level programs and residencies.

Notes :

Substantiating Evidence:
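
The 94% figure above combines the three categories of graduates listed in the finding. As a quick arithmetic check (added here for illustration only; the counts are taken directly from the finding):

\[
\frac{15 + 0 + 1}{17} \approx 0.94 = 94\%
\]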

Professional Objectives: Stated (Program level; Direct - Other)

Details/Description: The department will look at data provided by its senior exit survey to gauge student objectives to engage professionally in the art/design field and/or to make application for entry into a graduate program upon graduation.

Target: 70% of students responding indicate an intention to enter into an arts related profession and/or make application for entry into a graduate program upon graduation.

Implementation Plan (timeline): Annual: April

Key/Responsible Personnel: Department Faculty; Department Head reports

Supporting Attachments:

Findings for Professional Objectives: Stated

Summary of Findings: Of 18 graduating majors responding to the departmental exit survey, 17 (94%) indicate an intention to a) enter into an arts related profession and/or b) make application for entry into MFA/MBA programs upon graduation.

Target Achievement: Exceeded

Recommendations : None

Notes : Of 22 students completing the survey, 4 remain enrolled in coursework unrelated to their area of concentration. Those responses are not counted here.

Substantiating Evidence:

Positive Educational Experience

Mapped to: No Mapping

Measures & Findings

Departmental Quality (Program level; Direct - Other)

Details/Description: The department will look at data provided by its senior exit survey to assess student perceptions regarding the quality of education received in the UTC Art Department.

Target: 75% of students responding rank “4” or above on a “1” to “5” scale

Implementation Plan (timeline): Annually: April

Key/Responsible Personnel: Department Faculty; Department Head Reports

Supporting Attachments:

Findings for Departmental Quality

Summary of Findings: 100% of students responding to the department's exit survey ranked departmental quality as 4 or above.

Target Achievement: Exceeded

Recommendations : None

Notes :

Substantiating Evidence:

Program Quality BA (Program level; Direct - Other)

Details/Description: The department will look at data provided by its senior exit survey to assess student perceptions regarding the quality of education received in the specific degree program in the UTC Art Department.

Target: 75% of students responding rank “4” or above on a “1” to “5” scale

Implementation Plan (timeline): Annually: April

Key/Responsible Personnel: Department Faculty; Department Head Reports

Supporting Attachments:

Findings for Program Quality BA

Summary of Findings: None

Target Achievement: Not Met

Recommendations : Develop mechanisms to ensure that BA majors complete and return the department's exit survey.

Notes :

Substantiating Evidence:

Program Quality BFA (Program level; Direct - Other)

Details/Description: The department will look at data provided by its senior exit survey to assess student perceptions regarding the quality of education received in the specific degree program in the UTC Art Department.

Target: 75% of students responding rank “4” or above on a “1” to “5” scale

Implementation Plan (timeline): Annually: April

Key/Responsible Personnel: Department Faculty; Department Head Reports

Supporting Attachments:

Findings for Program Quality BFA

Summary of Findings: 100% of students responding to the department's exit survey ranked programmatic quality as 4 or above.

Target Achievement: Exceeded

Recommendations : None

Notes :

Substantiating Evidence:

Program Quality BS (Program level; Direct - Other)

Details/Description: The department will look at data provided by its senior exit survey to assess student perceptions regarding the quality of education received in the specific degree program in the UTC Art Department.

Target: 75% of students responding rank “4” or above on a “1” to “5” scale

Implementation Plan (timeline): Annually: April

Key/Responsible Personnel: Department Faculty; Department Head Reports

Supporting Attachments:

Findings for Program Quality BS

Summary of Findings: 100% of students responding to the department's exit survey ranked programmatic quality as 4 or above.

Target Achievement: Exceeded

Recommendations : None

Notes :

Substantiating Evidence:

B.S. Praxis and Practicum

Mapped to: No Mapping

Measures & Findings

Praxis and Practicum (Program level; Direct - Other)

Details/Description: Student preparedness to meet professional standards for K-12 teaching

Target: At least 80% of students will pass the Praxis II Exam on the first attempt and 100% will pass the practicum.

Implementation Plan (timeline): Annual: May

Key/Responsible Personnel: Professor Anne Lindsey; Department Head Reports

Supporting Attachments:

Findings for Praxis and Practicum

Summary of Findings: 100% of students passed the Praxis II Exam; 100% passed the practicum.

Target Achievement: Exceeded

Recommendations : None

Notes :

Substantiating Evidence:


Report: Assessment Plan Details for: Chemistry

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Chemistry Outcome Set

Outcomes

Fundamentals of chemistry

Mapped to: USA- ACS- Guidelines for Bachelor's Programs: 5.2 Introductory or General Chemistry., 5.3 Foundation Course Work., 5.4 In-Depth Course Work., 5.5 Laboratory Experience., 5.7 Cognate Courses.

Measures & Findings

Meeting ACS standards (Course level; Direct - Exam)

Details/Description: 1a. The average scores of chemistry students on ACS standardized exams in upper-level courses will be at or above the 50th percentile.

Target: at or above the 50th percentile

Implementation Plan (timeline): Every semester that the course is taught

Key/Responsible Personnel: faculty

Supporting Attachments:

Findings for Meeting ACS standards

Summary of Findings: Students attained the 91st percentile on the ACS Inorganic exam and the 43rd percentile on the Physical Chemistry I exam. Students also took the ACS exam in P Chem II, but norming data are not yet available for this new exam.

Target Achievement: Exceeded

Recommendations : Inorganic results are excellent. Analysis of the P Chem I exam indicates no pattern of missed questions. A and B students did very well on the exam; C students did extremely poorly. We will place additional emphasis on P Chem homework assignments.

Notes :

Substantiating Evidence:

Meeting ACS standards (Program level; Indirect - Survey)

Details/Description: 1b. The average response from our majors to survey questions about receiving a sound education in chemistry and the attainment of ACS standards will be 4.5 or higher on a scale of 1 to 5.

Target: 4.5 or higher on a scale of 1 to 5

Implementation Plan (timeline): annual

Key/Responsible Personnel: 486 instructor

Supporting Attachments:

Findings for Meeting ACS standards

Summary of Findings: Survey response was 4.9, up from 4.7 last year.

Target Achievement: Exceeded

Recommendations : 11 out of 12 responses were very positive. The single negative response dealt with our move to temporary quarters.

Notes :

Substantiating Evidence:


Cognate areas

Mapped to: USA- ACS- Guidelines for Bachelor's Programs: 5.7 Cognate Courses.

Measures & Findings

Cognate measures (Institution level; Direct - Exam)

Details/Description: 2a. The average score of chemistry students on the English, Math, and Science areas on the ACT-CAAP Exam will be at or above the 50th percentile.

Target: at or above the 50th percentile.

Implementation Plan (timeline): annually

Key/Responsible Personnel: faculty

Supporting Attachments:

Findings for Cognate measures

Summary of Findings: The ACT-CAAP national percentile subscores were: Critical Thinking Skills 81%, Reading 96%, Math 89%, and Science Reasoning 60%.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Cognate measures (Institution level; Indirect - Survey)

Details/Description: 2b. The average response from our majors to survey questions about their proficiency in cognate areas will be 4.5 or higher on a scale of 1 to 5.

Target: 4.5 or higher on a scale of 1 to 5

Implementation Plan (timeline): annually

Key/Responsible Personnel: faculty

Supporting Attachments:

Findings for Cognate measures

Summary of Findings: Survey response on writing proficiency was 4.6, up from 4.5 last year. Survey response on oral communication was 4.5, up from 4.4 last year.

Target Achievement: Met

Recommendations : All written responses were very favorable. We will look for topics we can abbreviate or eliminate in order to include advanced material in our three communication courses.

Notes :

Substantiating Evidence:

Cognate measures (Program level; Indirect - Other)

Details/Description: 2c. The instructors in Chem 286 and Chem 486 will evaluate students' speaking skills using an available objective measure.

Target: students will recognize their own improvement in public speaking skills

Implementation Plan (timeline): each semester

Key/Responsible Personnel: 286 & 486 faculty

Supporting Attachments:

Findings for Cognate measures

Summary of Findings: 19 out of 21 students in Chemistry 286 indicated more confidence in preparing and giving a scientific presentation. In open-ended surveys, students in both classes indicated the classes were very useful and improved their oral communication skills.

Target Achievement: Met

Recommendations : We will compare 286 and 486 measures to look for unnecessary redundancy in targeted speaking skills or for areas needing more improvement.

Notes :

Substantiating Evidence:


Research

Mapped to: USA- ACS- Guidelines for Bachelor's Programs: 5.5 Laboratory Experience.

Measures & Findings

Research (Program level; Direct - Other)

Details/Description: 3a. We will assess the quality of student research by monitoring the success rate of peer-reviewed student research submissions: conference presentations, honors theses, and journal articles.

Target: At least 50% acceptance and conference participation by students doing research

Implementation Plan (timeline): annually

Key/Responsible Personnel: faculty

Supporting Attachments:

Findings for Research

Summary of Findings: 20 out of 20 peer reviewed student abstract submissions to conferences were accepted. An additional 5 out of 5 peer reviewed faculty/student co-authored abstract submissions to conferences were accepted. 7 out of 8 article submissions were accepted, and 1 submission is in rewrite.

Target Achievement: Met

Recommendations : We will examine ways to accommodate additional students in faculty-directed research. We will examine closely plans to return to our renovated science building so that faculty-student research activities are not negatively impacted.

Notes :

Substantiating Evidence:

Research (Program level; Direct - Student Artifact)

Details/Description: 3b. Students participating in undergraduate research for ACS certification requirements must prepare a well-written, comprehensive, and well-documented research report including safety considerations.

Target: Papers will be collected from all of these students.

Implementation Plan (timeline): annually

Key/Responsible Personnel: faculty

Supporting Attachments:

Findings for Research

Summary of Findings: Each student prepared a paper approved by his/her research advisor.

Target Achievement: Met

Recommendations : Continue monitoring student work in research.

Notes :

Substantiating Evidence:

Research methods

Mapped to: USA- ACS- Guidelines for Bachelor's Programs: 4.2 Instrumentation., 4.3 Computational Capabilities and Software., 4.4 Chemical Information Resources., 4.5 Chemical Safety Resources., 7.4 Communication Skills., 7.5 Team Skills.

Measures & Findings

Research methods (Program level; Indirect - Survey)

Details/Description: 4a. The average response from our students to survey questions about acquiring these essential skills will be 4.5 or higher on a scale of 1 to 5.

Target: 4.5 or higher on a scale of 1 to 5

Implementation Plan (timeline): annually

Key/Responsible Personnel: faculty

Supporting Attachments:

Findings for Research methods

Summary of Findings: Student response on acquiring these essential skills was 4.9, up from 4.4 last year.

Target Achievement: Exceeded

Recommendations : We will examine requirements for these skills in our various lab courses in order to reduce redundancy and possibly introduce advanced requirements.

Notes :

Substantiating Evidence:

Career outcome

Mapped to: USA- ACS- Guidelines for Bachelor's Programs: 7.1 Problem-Solving Skills., 7.2 Chemical Literature Skills., 7.3 Laboratory Safety Skills., 7.4 Communication Skills., 7.5 Team Skills., 7.6 Ethics., 7.8 Student Mentoring and Advising.

Measures & Findings

Careers (Program level; Direct - Other)

Details/Description: 5a. 75% or more of our graduating chemistry students applying to graduate or health-profession programs will be admitted.

Target: 75% acceptance

Implementation Plan (timeline): annually

Key/Responsible Personnel: faculty

Supporting Attachments:

Findings for Careers

Summary of Findings: 6 out of 6 applicants to graduate school were accepted. 7 out of 9 applicants to medical school were accepted. 2 out of 3 applicants to dental school were accepted. 9 applicants to pharmacy school were accepted.

Target Achievement: Met

Recommendations : We will review our current advising practices in order to guarantee that all chemistry majors receive effective advising from the very onset of their studies.

Notes :

Substantiating Evidence:

Careers (Program level; Indirect - Survey)

Details/Description: 5b. Feedback will be solicited from our former students in graduate and health-profession programs on how well they were prepared.

Target: 75% will indicate they were well prepared

Implementation Plan (timeline): every 5 years

Key/Responsible Personnel: faculty

Supporting Attachments:

Findings for Careers

Summary of Findings: 11 out of 11 2008-09 graduates gave very positive comments on their UTC chemistry experience.

Target Achievement: Met

Recommendations : Investigate ways to obtain feedback from more graduates within the last ten years.

Notes :

Substantiating Evidence:


Report: Assessment Plan Details for: Criminal Justice: BS

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

1. Master Core Material

Outcomes

1.1 Mastery of Core Material

Mapped to: Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 3.10 General Ed & Major Field Tests

Measures & Findings

ETS Testing (Program level; Direct - Exam)

Details/Description: This is a survey administered by the faculty member teaching Senior Seminar. Tests are purchased from ETS by the Office of Planning, Evaluation, and Institutional Research.

Target: At least 75% of graduating seniors will score at or above the 75th percentile on the ETS Criminal Justice Exam, which will be administered in the Senior Seminar.

Implementation Plan (timeline): Fall 2008-Spring 2009

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; Eigenberg; Thompson; All faculty teaching core courses

Supporting Attachments:

Findings for ETS Testing

Summary of Findings: 75% of students taking the ETS exit exam scored at or above the 75th percentile (79% for Fall 2008 and 72% for Spring 2009).

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

1.2 Comprehensive Exposure to Core Material

Mapped to: No Mapping

Measures & Findings

Exposure to Doctoral Level Faculty in Core Courses (Program level; Indirect - Other)

Details/Description: This will be measured by using teaching roster assignments.

Target: 75% of core courses above the 100 level will be taught by tenured or tenure track faculty.

Implementation Plan (timeline): Fall 2008-Spring 2009

Key/Responsible Personnel: Eigenberg

Supporting Attachments:

Findings for Exposure to Doctoral Level Faculty in Core Courses

Summary of Findings: 64% (Fall 2008) and 56% (Spring 2009) of core courses were taught by tenure track faculty.

Target Achievement: Not Met

Recommendations : Fill tenure track position.

Notes : We had a tenure track position filled by a one-year instructor. 91% (Fall 2008) and 81% (Spring 2009) of core courses were taught by full time faculty members.

Substantiating Evidence:

2. Student Assessment of Program Quality

Outcomes

2.1 Student Satisfaction with Overall Quality of Program

Mapped to: No Mapping

Measures & Findings

Student Satisfaction Overall Quality of Program (Program level; Indirect - Survey)

Details/Description: This survey is administered in Blackboard by the faculty member teaching CRMJ 485 (Senior Seminar).

Target: Graduates will have a mean score of 2.0 or lower on the Senior Exit Survey on these items.

Implementation Plan (timeline): Fall 2008-Spring 2009

Key/Responsible Personnel: Eigenberg; Thompson; All program faculty

Supporting Attachments:

Findings for Student Satisfaction Overall Quality of Program

Summary of Findings: No findings are available. There was a misunderstanding about how the survey was to be administered.

Target Achievement: Not Met

Recommendations : Ensure survey is administered in Senior Seminar (CRMJ 485) next year.

Notes :

Substantiating Evidence:

2.2 Overall Satisfaction

Mapped to: No Mapping

Measures & Findings

Overall Student Satisfaction (Program level; Indirect - Survey)

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: Criminal Justice majors will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses the entire educational experience.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; All program faculty; Eigenberg

Supporting Attachments:

Findings for Overall Student Satisfaction

Summary of Findings: Criminal Justice majors had a mean of 3.14 on this item.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:


3. Mastery of Practical Skills

Outcomes

3.1 Practical Skills required for Career

Mapped to: Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 1.1 Service Learning, Measurable Outcome 1.3 Experiential Learning Opportunities, Measurable Outcome 2.1 Distinctive Experience Outside Class, Measurable Outcome 2.2 Student Satisfaction, Measurable Outcome 3.9 Student Satisfaction

Measures & Findings

Practical Skills Gained in Internship Program (Course level; Direct - Student Artifact)

Details/Description: This item is measured by student records kept by the Internship Coordinator.

Target: At least 75% of the internships will result in a passing grade.

Implementation Plan (timeline): Fall 2008, Spring 2009, Summer 2009

Key/Responsible Personnel: Garland; All program faculty

Supporting Attachments:

Findings for Practical Skills Gained in Internship Program

Summary of Findings: 100% of students enrolled in internships received a passing grade.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Practical Skills Gained in University and Major Courses (Program level; Indirect - Survey)

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: Criminal Justice majors will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s National Survey of Student Engagement which assesses whether a student’s experience at UTC led them to acquire job or work related knowledge and skills.

Implementation Plan (timeline): Fall 2008, Spring 2009

Key/Responsible Personnel: All program faculty; Eigenberg

Supporting Attachments:

Findings for Practical Skills Gained in University and Major Courses

Summary of Findings: Criminal Justice majors had a mean of 2.71 on this item.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

3.2 Student Internships

Mapped to: Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 1.1 Service Learning, Measurable Outcome 1.3 Experiential Learning Opportunities, Measurable Outcome 2.1 Distinctive Experience Outside Class, Measurable Outcome 2.2 Student Satisfaction

Measures & Findings

Demonstrate Mastery of Practical Skills (Course level; Direct - Student Artifact)

Details/Description: This item is measured by student records kept by the Internship Coordinator.

Target: At least 75% of majors completing an internship will be certified by the internship coordinator as having satisfactory work performance.

Implementation Plan (timeline): Fall 2008, Spring 2009, Summer 2009

Key/Responsible Personnel: Garland

Supporting Attachments:

Findings for Demonstrate Mastery of Practical Skills

Summary of Findings: 100% of majors completing an internship were certified by the internship coordinator as having satisfactory work performance.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

3.4 Service Learning

Mapped to: Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 1.1 Service Learning, Measurable Outcome 1.3 Experiential Learning Opportunities, Measurable Outcome 2.1 Distinctive Experience Outside Class

Measures & Findings

Participation in Service Learning (Program level; Indirect - Other)

Details/Description: This item is measured by enrollment data and course syllabi for CRMJ 485 and the internship course(s).

Target: 100% of students in the program will participate in some form of service learning as evidenced by curriculum requirements related to the Senior Seminar course (485) and the internship program.

Implementation Plan (timeline): Fall 2008 and Spring 2009

Key/Responsible Personnel: Eigenberg; Thompson; Garland

Supporting Attachments:

Findings for Participation in Service Learning

Summary of Findings: 100% of students in the program participated in some form of service learning.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

4. Mastery of Writing Skills

Outcomes

4.1 Student Assessment of Writing Skills

Mapped to: Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 2.2 Student Satisfaction

Measures & Findings

Mastery of Writing Skills (Program level; Indirect - Survey)

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: The Criminal Justice program will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses whether a student’s experience at UTC added to their ability to write clearly and effectively.

Implementation Plan (timeline): Fall 2008 and Spring 2009

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; Eigenberg; All program faculty

Supporting Attachments:

Findings for Mastery of Writing Skills

Summary of Findings: Criminal Justice majors had a mean of 3.00 on this item.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence: Faculty Report Exposure to Various Assignments (Microsoft Word)


Student Satisfaction Writing Skills (Program level; Indirect - Survey)

Details/Description: This survey is administered in Blackboard by the faculty member teaching CRMJ 485 (Senior Seminar).

Target: 75% of students completing the Senior Survey will agree that they have good writing skills.

Implementation Plan (timeline): Fall 2008 and Spring 2009

Key/Responsible Personnel: Eigenberg; Thompson

Supporting Attachments:

Findings for Student Satisfaction Writing Skills

Summary of Findings: No findings are available. There was a misunderstanding about how the survey was to be administered.

Target Achievement: Not Met

Recommendations : Ensure survey is administered in Senior Seminar (CRMJ 485) next year.

Notes :

Substantiating Evidence:

4.2 Exposure to Writing Assignments

Mapped to: No Mapping

Measures & Findings

Exposure to Writing Assignments (Direct - Other)

Details/Description: This data is obtained from course syllabi and survey of faculty about their course requirements.

Target: 100% of courses at the 300/400 level will have a significant writing assignment.

Implementation Plan (timeline): Fall 2008 and Spring 2009

Key/Responsible Personnel: Eigenberg; All program faculty teaching at the 300/400 level

Supporting Attachments:

Findings for Exposure to Writing Assignments

Summary of Findings: 100% of courses at the 300/400 level had a significant writing assignment.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

5. Mastery of Computer Skills

Outcomes

5.1 Student Assessment of Computer Skills

Mapped to: Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 2.2 Student Satisfaction

Measures & Findings

Student Reports about Computer Skills (Program level; Indirect - Survey)

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: The Criminal Justice program will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses whether students used an electronic medium to discuss or complete an assignment; worked on an assignment where they used a computer; and whether their experience contributed to their ability to use computing and informational technology.

Implementation Plan (timeline): Fall 2008 and Spring 2009

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; Eigenberg; All program faculty

Supporting Attachments:

Findings for Student Reports about Computer Skills

Summary of Findings: Criminal Justice majors had means of 2.43, 3.43, and 2.57 on these items.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Student Reports about Computer Skills (Program level; Indirect - Survey)

Details/Description: This survey is administered in Blackboard by the faculty member teaching CRMJ 485 (Senior Seminar).

Target: 75% of students completing the Senior Survey will agree that they have good computer skills.

Implementation Plan (timeline): Fall 2008 and Spring 2009

Key/Responsible Personnel: Eigenberg; Thompson; All program faculty

Supporting Attachments:

Findings for Student Reports about Computer Skills

Summary of Findings: No findings are available. There was a misunderstanding about how the survey was to be administered.

Target Achievement: Not Met

Recommendations : Ensure survey is administered in Senior Seminar (CRMJ 485) next year.

Notes :

Substantiating Evidence:

5.2 Exposure to Computer Assignments

Mapped to: No Mapping

Measures & Findings

Exposure to Computer Assignments (Program level; Indirect - Other)

Details/Description: This data is obtained from course syllabi and survey of faculty about their course requirements.

Target: At least 50% of the criminal justice courses taught will require students to complete at least one assignment involving computer technology.

Implementation Plan (timeline): Fall 2008 and Spring 2009

Key/Responsible Personnel: Eigenberg; All program faculty

Supporting Attachments:

Findings for Exposure to Computer Assignments

Summary of Findings: 87% of the criminal justice courses taught required at least one computer assignment.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Exposure to Computer Assignments (Program level; Indirect - Other)

Details/Description: This data is obtained from course syllabi and survey of faculty about their course requirements.

Target: 50% of full time faculty will require students to use at least some component of Blackboard in their courses.

Implementation Plan (timeline): Fall 2008 and Spring 2009

Key/Responsible Personnel: Eigenberg; All full time program faculty

Supporting Attachments:

Findings for Exposure to Computer Assignments

Summary of Findings: 100% of full time faculty used at least some component of Blackboard in their courses.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Exposure to Computer Assignments (Indirect - Other)

Details/Description: This data is obtained from course syllabi and survey of adjunct faculty about their course requirements.

Target: 50% of adjunct faculty will require students to use at least some component of Blackboard in their courses.

Implementation Plan (timeline): Fall 2008 and Spring 2009

Key/Responsible Personnel: Eigenberg; All adjunct faculty

Supporting Attachments:

Findings for Exposure to Computer Assignments

Summary of Findings: 100% of adjunct faculty used at least some component of Blackboard in their courses.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

5.3 Expand use of technology in teaching and learning.

Mapped to: Strategic Initiative: Enabling Partnerships: Measurable Outcome 4.2 Expand Library Electronic Resources

Measures & Findings

Faculty Development Computer Technology (Program level; Direct - Other)

Details/Description: This data is obtained from annual EDO reports completed by faculty and the Department Head.

Target: 100% of full time faculty will attend at least one faculty development workshop (internal or external) relating to technology so that they may better integrate it into their teaching and student assignments.

Implementation Plan (timeline): Fall 2008 and Spring 2009

Key/Responsible Personnel: Eigenberg; All full time program faculty

Supporting Attachments:

Findings for Faculty Development Computer Technology

Summary of Findings: 86% of faculty (6 of 7) attended at least one faculty development program associated with technology.

Target Achievement: Not Met

Recommendations : Continue to monitor EDOs and stress this goal.

Notes :

Substantiating Evidence:

6. Mastery of Oral Communication Skills

Outcomes

6.1 Student Assessment of Oral Communication Skills

Mapped to: Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 3.11 General Education Evaluation

Measures & Findings

Student Reports of Oral Communication Skills (Program level; Indirect - Survey)

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: The Criminal Justice program will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses whether a student asked questions in class or contributed to class discussions, and whether they made a class presentation.

Implementation Plan (timeline):

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; All program faculty; Eigenberg

Supporting Attachments:

Findings for Student Reports of Oral Communication Skills

Summary of Findings: Criminal Justice majors had means of 3.43 and 2.29 on these items.

Target Achievement: Met

Recommendations : Try to identify some courses where presentations could be added.

Notes : The first item exceeded the goal but the second failed to meet it. Thus it appears that students have opportunities for class discussions and questions, but not for class presentations. This finding is a reflection of large class sizes.

Substantiating Evidence:

Student Reports of Oral Communication Skills (Program level; Indirect - Survey)

Details/Description: This survey is administered in Blackboard by the faculty member teaching CRMJ 485 (Senior Seminar).

Target: 75% of students completing the Senior Survey will agree that they have good oral communication skills.

Implementation Plan (timeline): Fall 2008 and Spring 2009

Key/Responsible Personnel: Eigenberg; Thompson; All program faculty

Supporting Attachments:

Findings for Student Reports of Oral Communication Skills

Summary of Findings: No findings are available. There was a misunderstanding about how the survey was to be administered.

Target Achievement: Not Met

Recommendations : Ensure survey is administered in Senior Seminar (CRMJ 485) next year.

Notes :

Substantiating Evidence:

6.2 Exposure to Oral Communication Assignments

Mapped to: No Mapping

Measures & Findings

Exposure to Oral Presentations (Program level; Indirect - Other)

Details/Description: This data is obtained from course syllabi and survey of faculty about their course requirements.

Target: At least 25% of the courses taught will require one oral presentation.

Implementation Plan (timeline): Fall 2008 and Spring 2009

Key/Responsible Personnel: Eigenberg; All program faculty

Supporting Attachments:

Findings for Exposure to Oral Presentations

Summary of Findings: 13% of courses required at least one oral presentation.

Target Achievement: Not Met

Recommendations : Continue to strive for 25%.

Notes : Figure may not be realistic given our high faculty to student ratios and large classes.

Substantiating Evidence:


6.3 Speak Effectively and Clearly

Mapped to: No Mapping

Measures & Findings

Speak Effectively (Program level; Indirect - Survey)

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: Criminal Justice majors will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s National Survey of Student Engagement which assesses whether a student’s experience at UTC led them to report they can speak clearly and effectively.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; All program faculty; Eigenberg

Supporting Attachments:

Findings for Speak Effectively

Summary of Findings: Criminal Justice majors had a mean of 2.43 on this item.

Target Achievement: Met

Recommendations :

Notes : Given the small sample size, it is unlikely that a difference of .07 is significant; therefore, it was determined that this goal was met.

Substantiating Evidence:

7. Exposure to Diversity Issues in the Curriculum

Outcomes

7.1 Integration of Diversity Issues in the Curriculum

Mapped to: Strategic Initiative: Partnerships for Diversity: Measurable Outcome 1.5 Increased Tolerance

Measures & Findings

Exposure to Diversity Issues (Program level; Indirect - Other)

Details/Description: This data is taken from enrollment data and course catalog requirements.

Target: 100% of all students will take at least one course that expressly deals with diversity issues in criminal justice.

Implementation Plan (timeline): Fall 2008 and Spring 2009

Key/Responsible Personnel: Eigenberg; All program faculty

Supporting Attachments:

Findings for Exposure to Diversity Issues

Summary of Findings: 100% of all students took at least one course that expressly deals with diversity issues in criminal justice.

Target Achievement: Met

Recommendations :

Notes : Majors are required to take a course on either minorities and crime, gender and crime, or comparative criminal justice.

Substantiating Evidence:

7.2 Integration of International Issues in the Curriculum

Mapped to: Strategic Initiative: Partnerships for Diversity: Measurable Outcome 1.5 Increased Tolerance

Measures & Findings

Exposure to Global Issues (Program level; Indirect - Other)

Details/Description: This data is obtained from course syllabi and survey of faculty about their course requirements.

Target: At least 25% of the courses taught will require at least one significant module on global/international issues in criminal justice.

Implementation Plan (timeline): Fall 2008 and Spring 2009

Key/Responsible Personnel: Eigenberg; All program faculty

Supporting Attachments:

Findings for Exposure to Global Issues

Summary of Findings: No findings are available. There was a misunderstanding about how the survey was to be administered.

Target Achievement: Not Met

Recommendations : Ensure survey is administered in Senior Seminar (CRMJ 485) next year.

Notes :

Substantiating Evidence:

7.3 Integration of Ethical Issues in the Curriculum

Mapped to: Strategic Initiative: Partnerships for Diversity: Measurable Outcome 1.5 Increased Tolerance

Measures & Findings

Ethics and the Curriculum (Program level; Indirect - Other)

Details/Description: This data is taken from enrollment data and course catalog requirements.

Target: 100% of all students will take at least one course that expressly deals with ethics in criminal justice.

Implementation Plan (timeline): Fall 2008 and Spring 2009

Key/Responsible Personnel: Eigenberg; Knox

Supporting Attachments:

Findings for Ethics and the Curriculum

Summary of Findings: 37% (n=104) of our majors (n=283) took our ethics class (430) last academic year.

Target Achievement: Not Met

Recommendations :

Notes : The catalog was changed effective 2008. All new entering students will have taken at least one class that expressly deals with ethics in criminal justice as a graduation requirement.

Substantiating Evidence:
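
As a quick check of the 37% figure reported above (added for illustration only; the enrollment counts are taken directly from the finding):

\[
\frac{104}{283} \approx 0.37 = 37\%
\]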

7.4 Exposure to Diverse People and Perspectives

Mapped to: No Mapping

Measures & Findings

Diversity Experiences in Person or in Class (Program level; Indirect - Survey)

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: Criminal Justice majors will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses whether they included diverse perspectives in class discussions or writing assignments and had serious conversations with students of a different race/ethnicity.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Eigenberg; All Program Faculty

Supporting Attachments:

Findings for Diversity Experiences in Person or in Class

Summary of Findings: Criminal Justice majors had means of 2.86 and 2.71 on these items.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:


7.5 Increased Understanding of Diverse Groups

Mapped to: No Mapping

Measures & Findings

Exposure to Diverse People (Program level; Indirect - Survey)

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: Criminal Justice majors will achieve a mean score of 2.75 or higher on the Office of Planning, Evaluation, and Institutional Research’s National Survey of Student Engagement which assesses whether a student’s experience at UTC increased their understanding of other racial and ethnic backgrounds.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: All program faculty; Eigenberg

Supporting Attachments:

Findings for Exposure to Diverse People

Summary of Findings: Criminal Justice majors had a mean of 3.00 on this item.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

8. Student Retention

Outcomes

8.1 Monitor High Risk Students

Mapped to: Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 2.7 Retention & Persistence

Measures & Findings

Intervention with High Risk Students (Program level; Indirect - Other)

Details/Description: This data is being kept by our Chief Departmental Advisor.

Target: At least 50% of all majors with a GPA of less than 2.0 will meet with an advisor to discuss program progress.

Implementation Plan (timeline): Fall 2009 and Spring 2010

Key/Responsible Personnel: Thompson

Supporting Attachments:

Findings for Intervention with High Risk Students

Summary of Findings: No findings are available. There was a misunderstanding about how the survey was to be administered.

Target Achievement: Not Met

Recommendations : Ensure survey is administered in Senior Seminar (CRMJ 485) next year.

Notes :

Substantiating Evidence:

8.2 Quality Advising

Mapped to: Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 2.4 Strong Commitment to Program, Measurable Outcome 2.5 Student Engagement

Measures & Findings

Student Satisfaction with Advising (Program level; Indirect - Survey)

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: The Criminal Justice program will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses the quality of advising they received at UTC.

Implementation Plan (timeline): Fall 2008 and Spring 2009

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; Eigenberg; Thompson; Garland; Knox; Hensely; Bumphus

Supporting Attachments:

Findings for Student Satisfaction with Advising

Summary of Findings: Criminal Justice majors had a mean of 3.14 on this item.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

8.3 Scheduling

Mapped to: Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 2.7 Retention & Persistence, Measurable Outcome 4.4 Retention/Graduation

Measures & Findings

Course Rotation Plan (Program level; Indirect - Other)

Details/Description: This data is coming from course offering rosters.

Target: 100% of all required courses will be offered at least once a year.

Implementation Plan (timeline): Fall 2008 and Spring 2009

Key/Responsible Personnel: Eigenberg

Supporting Attachments:

Findings for Course Rotation Plan

Summary of Findings: 100% of all required courses were offered at least once a year.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

8.4 Use Distance Learning to Facilitate Access

Mapped to: Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 4.1 Distance/Alternative Delivery

Measures & Findings

Course Delivery Distance Learning (Program level; Indirect - Other)

Details/Description: This data is coming from course offering rosters.

Target: Offer at least 6 distance learning courses annually.

Implementation Plan (timeline): Fall 2008, Spring 2009, Summer 2009

Key/Responsible Personnel: Eigenberg

Supporting Attachments:

Findings for Course Delivery Distance Learning

Summary of Findings: 7 distance learning courses were offered last year (fall, spring and summer).

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

8.5 Quality of Relationship between Students and Faculty

Mapped to: No Mapping

Measures & Findings

Student Faculty Relationships (Program level; Indirect - Survey)

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: Criminal Justice majors will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses the quality of advising they received at UTC.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: All program faculty; Eigenberg

Supporting Attachments:

Findings for Student Faculty Relationships

Summary of Findings: Criminal Justice majors had a mean of 4.29 on this item.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

9. Critical Thinking Skills

Outcomes

9.1 Integration of Ideas

Mapped to:No Mapping

Measures & Findings

Integrating Material from Various Sources
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: Criminal Justice majors will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses whether a student integrated ideas or information from various sources, various courses, or during class discussions.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: All program faculty; Eigenberg

Supporting Attachments:

Findings for Integrating Material from Various Sources

Summary of Findings: Criminal Justice majors had means of 3.14 and 2.71 on these items.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

9.2 Analyzing and Synthesizing Information

Mapped to:No Mapping

Measures & Findings

Analyzing and Synthesizing
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: Criminal Justice majors will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses whether a student analyzed the basic elements of an idea, experience, or theory and whether they synthesized information into more complex ideas.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: All program faculty; Eigenberg

Supporting Attachments:

Findings for Analyzing and Synthesizing

Summary of Findings: Criminal Justice majors had means of 3.14 and 2.86 on these items.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

9.3 Making Judgements

Mapped to:No Mapping

Measures & Findings

Judgement
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: Criminal Justice majors will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses whether a student made judgements about information and data.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: All program faculty; Eigenberg

Supporting Attachments:

Findings for Judgement

Summary of Findings: Criminal Justice majors had a mean of 2.71 on this item.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

9.4 Reported Ability to Think Critically and Analytically

Mapped to:No Mapping

Measures & Findings

Ability to Think Critically
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: Criminal Justice majors will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses whether a student’s experience at UTC added to their ability to think clearly and analytically.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: All program faculty; Eigenberg

Supporting Attachments:

Findings for Ability to Think Critically

Summary of Findings: Criminal Justice majors had a mean of 3.14 on this item.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:


Report: Assessment Plan Details for: Criminal Justice: MSCJ

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

1. Master Core Material

Outcomes

1.1 Master and Integrate Core Material

Mapped to:No Mapping

Measures & Findings

Theses
Program level; Direct - Exam

Details/Description: Data is calculated by the Graduate Program Coordinator using comprehensive exam results.

Target: At least 75% of the students will pass the Criminal Justice Master’s comprehensive exam on their first attempt and demonstrate mastery of core material, use of critical thinking skills, and application of research skills.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Iles; All program faculty

Supporting Attachments:

Findings for Theses

Summary of Findings: 86% of the graduates passed the Criminal Justice Master’s comprehensive exam on their first attempt.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Thesis Completion
Program level; Direct - Student Artifact

Details/Description: Data is calculated by the Graduate Program Coordinator using completed theses.

Target: 75% of all students who attempt a thesis will complete it and will demonstrate mastery of core material, use of critical thinking skills, and application of research skills.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Iles; Eigenberg; All program faculty

Supporting Attachments:

Findings for Thesis Completion

Summary of Findings: 75% of students who attempted a thesis (through the thesis class or by starting a prospectus) completed it.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:


2. High Quality Educational Experience

Outcomes

2.1 Quality Experience for Students

Mapped to:No Mapping

Measures & Findings

Student Satisfaction
Program level; Indirect - Survey

Details/Description: This survey is administered by the Graduate Coordinator.

Target: 75% of students completing the graduate exit survey will report they are satisfied with the overall quality of their experience.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: All program faculty; Iles

Supporting Attachments:

Findings for Student Satisfaction

Summary of Findings: 100% of the graduates reported they were satisfied or very satisfied with the overall quality of their experience.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

2.2 Admission To Doctoral Programs

Mapped to:No Mapping

Measures & Findings

Admission to PhD programs
Program level; Direct - Student Artifact

Details/Description: Data is calculated by the Graduate Program Coordinator using a ratio of the number of students known to apply for a PhD to the number accepted.

Target: 75% of graduates who choose to pursue a PhD will be accepted into a program.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: All program faculty; Iles

Supporting Attachments:

Findings for Admission to PhD programs

Summary of Findings: 100% of graduates (n=2) who chose to pursue a PhD were accepted into a program.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

3. Critical Thinking Skills

Outcomes

3.1 Student Assessment of Critical Thinking Skills

Mapped to:No Mapping

Measures & Findings

Student Assessment of Critical Thinking
Indirect - Survey

Details/Description: This survey is administered by the Graduate Coordinator.

Target: 75% of the graduates completing the exit exam will indicate they are satisfied with the extent to which the program increased their ability to think critically.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Iles; All program faculty

Supporting Attachments:

Findings for Student Assessment of Critical Thinking

Summary of Findings: 100% of the graduates completing the exit exam were satisfied or very satisfied on this item.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

4. Mastery of Writing Skills

Outcomes

4.1 Student Satisfaction with Writing Skills

Mapped to:No Mapping

Measures & Findings

Student Assessment of Writing Skills
Indirect - Survey

Details/Description: This survey is administered by the Graduate Coordinator.

Target: 75% of the graduates completing the exit exam will indicate they are satisfied with the extent to which the program increased their ability to write.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: All program staff; Iles

Supporting Attachments:

Findings for Student Assessment of Writing Skills

Summary of Findings: 100% of the graduates completing the exit exam were satisfied or very satisfied on this item.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

4.2 Exposure to Writing Assignments

Mapped to:No Mapping

Measures & Findings

Writing Assignments
Program level; Direct - Other

Details/Description: This data is obtained from course syllabi and a survey of faculty about their course requirements.

Target: 100% of all graduate courses will have at least one significant writing assignment.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: All program staff; Iles

Supporting Attachments:

Findings for Writing Assignments

Summary of Findings: 100% of all graduate courses had at least one major writing assignment or multiple smaller assignments that were the equivalent of one larger assignment.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

5. Mastery of Communication Skills


Outcomes

5.1 Student Assessment of Communication Skills

Mapped to:No Mapping

Measures & Findings

Student Assessment of Oral Communication
Program level; Indirect - Survey

Details/Description: This survey is administered by the Graduate Coordinator.

Target: 75% of the graduates completing the exit exam will indicate they are satisfied with the extent to which the program increased their oral communication skills.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: All program faculty; Iles

Supporting Attachments:

Findings for Student Assessment of Oral Communication

Summary of Findings: 100% of the graduates completing the exit exam were satisfied or very satisfied on this item.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

5.2 Exposure to Oral Presentations

Mapped to:No Mapping

Measures & Findings

Oral Presentations
Program level; Direct - Student Artifact

Details/Description: This data is obtained from course syllabi and a survey of faculty about their course requirements.

Target: 100% of students will have at least one significant oral presentation in their program.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: All program faculty; Iles

Supporting Attachments:

Findings for Oral Presentations

Summary of Findings: 100% of all graduates had at least one significant oral presentation.

Target Achievement: Met

Recommendations :

Notes : Students had oral presentations in CRMJ 502, 503, 513, 516, 596 and for all theses in 2008-09.

Substantiating Evidence:

6. Mastery of Research Skills

Outcomes

6.1 Student Assessment of Research Skills

Mapped to:No Mapping

Measures & Findings

Student Assessment of Research Skills
Program level; Indirect - Survey

Details/Description: This survey is administered by the Graduate Coordinator.

Target: 75% of the graduates completing the exit exam will indicate they are satisfied with their mastery of research skills.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: All program faculty; Iles

Supporting Attachments:

Findings for Student Assessment of Research Skills

Summary of Findings: 100% of the graduates completing the exit exam were satisfied or very satisfied with their ability to evaluate research and conduct research.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

6.2 Application of Research Skills

Mapped to:No Mapping

Measures & Findings

Demonstration of Research Skills
Program level; Direct - Student Artifact

Details/Description: The Graduate Coordinator monitors comprehensive exams to ensure that there is an application of research methods/skills. She also monitors completed theses to ensure this requirement is met as well.

Target: 100% of students will be required to use/apply research method skills and interpret data either in comprehensive exams or theses.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: All program faculty; Iles

Supporting Attachments:

Findings for Demonstration of Research Skills

Summary of Findings: 100% of students demonstrated their mastery of research skills by successfully completing their thesis or by passing the comprehensive exam.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:


Report: Assessment Plan Details for: Foreign Languages: BA

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Foreign Languages: BA French, Spanish

Outcomes

I.1 SPEAKING

Mapped to: National Standards in Foreign Language Education: Standard 1.1: Conversations

Measures & Findings

Auralog pre and post-test - FrenchCourse level; Direct - Exam

Details/Description: TellMeMore is a software programwith built-in voice recognition that measures oral andaural comprehension as well as grammar, culture, andreading skills. This is produced by the company Auralog.

Target: Incoming first-year students are expected toscore between 1.0 and 1.6 on the 10-point scale. At theend of the first year they should score between 2.0-2.6.Incoming second-year students are expected to performbetween 2.0-2.6. At the end of the second year theyshould score between 3.0-3.6.

Implementation Plan (timeline): This was administeredat the beginning of the first and second-year coursesand at the end of the first and second-year sequences.

Key/Responsible Personnel: Dr. Felicia B. SturzerActing Dept. Head

Supporting Attachments:Auralog scores correlated to EU standards (Microsoft

Word)This chart compares the scores of pre and post tests toCouncil of Europe Standards

Findings for Auralog pre and post-test - French

Summary of Findings: The findings at this introductorylevel are as expected. The lowest scores coming inspent less time working in the software but, due toclassroom instruction, still made the greatest gains.Those with the highest incoming scores yielded thelowest gains because language acquisition slows as thematerial gets more difficult. The two groups in themiddle also performed as expected in that the groupthat consistently worked harder, as measured by time inthe software, made greater strides than the group thatworked less. Overall, increases in gain diminished asscores increased, reflecting the increasing difficulty ofthe material.As an absolute measure of progress, students met ourexpectations.At the second year level, lower performing studentsspent substantially more time in the software and mademore substantial improvement. It must be noted thatthose scoring a 1, simply did not engage with theinstrument, so improvements at the lowest level aresuspect and those with 1s should have been discountedin the measures.

Target Achievement: Met

Recommendations : Since, at the introductory levelthere is no remarkable difference in gain between groupsthat worked more, as measured in time, from those whoworked less, we conclude that the software provideslittle added value when measured against time on task.At the intermediate level, supplemental work did seemto yield better results.


Notes :

Substantiating Evidence:Pre and Post test in Auralog for FREN 211-212 (Word

Document (Open XML))Pre and Post test results in Auralog FREN

101-102 (Word Document (Open XML))

Auralog pre and post-test - SpanishCourse level; Direct - Exam

Details/Description: TellMeMore is a software programwith built-in voice recognition that measures oral andaural comprehension as well as grammar, culture, andreading skills. This is produced by the company Auralog.

Target: Incoming first-year students are expected toscore between 1.0 and 1.6 on the 10-point scale. At theend of the first year they should score between 2.0-2.6.Incoming second-year students are expected to performbetween 2.0-2.6. At the end of the second year theyshould score between 3.0-3.6.

Implementation Plan (timeline): This was administeredat the beginning of the first and second-year coursesand at the end of the first and second-year sequences.

Key/Responsible Personnel: Dr. Felicia B. SturzerActing Dept. Head

Supporting Attachments:

No Findings Added to Auralog pre and post-test -Spanish

End of Course Oral Exams/SOPI - SpanishCourse level; Direct - Exam

Details/Description: End of course oral exams or theSOPI, (Simulated Oral Proficiency Interview), are used toassess speaking proficiency and/or mastery of coursespeaking objectives. All tests are recorded in instructorGradebook. 311/312 exams are recorded on tapesmaintained in the department.

Target: By the end of elementary Spanish, students will be introduced to this task and can perform at the novice level. 85% of students in Spanish 101, 102, and 211 will receive grades of 70% or better on end of course oral exams. By the end of intermediate Spanish (212), 85% of students will perform at the Novice High or Intermediate Low level. After completing both Spanish 311 and 312 (the courses may be taken in any order), 85% of students will perform at the Intermediate Low level or higher.

Implementation Plan (timeline): Students areadministered oral exams by their instructors at the endof Spanish 101, 102, 211. At the end of 212, 311, and312, students will receive an SOPI.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D.,Acting Dept. Head

Supporting Attachments:

Findings for End of Course Oral Exams/SOPI -Spanish

Summary of Findings: There were no SOPI findings for Spanish 212 and 312 because of a lack of training for new instructors. However, end of course oral exams indicate that at least 85% of students scored 70% or better on end of course exams in Spanish 101, 102, 211, 212, 311, and 312 courses.

Target Achievement: Met

Recommendations : Training for new instructors of 211, 212, 311, and 312, as well as insistence on end of course oral exams.

Notes :

Substantiating Evidence:


Spanish 101 Oral interview (Microsoft Word): These are the guidelines for the interview along with a rubric.

I Can surveyOther level; Indirect - Survey

Details/Description: Students were asked what theycan do in the target language across the first andsecond year courses.

Target: Students will be 75% proficient at stated goal.

Implementation Plan (timeline): Throughout the2008-2009 year at the first and second year level, asadministered by students of the Psychology dept andsupervised by a professor in the Psychology Dept.

Key/Responsible Personnel: Felicia Sturzer, Ph.D.Acting Dept. Head

Supporting Attachments:I Can survey (Microsoft Word)This is the survey administered to all students in the firstand second year language sequences

Findings for I Can survey

Summary of Findings: Students met this goal with atleast 75% proficiency.

Target Achievement: Met

Recommendations : Continue to promote this skill.

Notes :

Substantiating Evidence:I Can survey (Microsoft Word)Results of student survey 2008-2009.

SOPI Test (Simulated Oral ProficiencyInterview) - FrenchCourse level; Direct - Exam

Details/Description: This test is used by governmentagencies and by the American Council on the Teaching ofForeign Languages (ACTFL) to assess speakingproficiency in a target language. The test is recordedand assessed by instructors who have been trained inevaluation by this method.

It assesses speaking ability from Novice throughSuperior. Descriptions of the various levels are attached.

Target: By the end of the intermediate conversationcourse (FREN/SPAN 212) students should attain a NoviceHigh-Intermediate Low rating.

By the end of the composition-conversation course(FREN 312/SPAN 312) students should attainIntermediate Low - High rating.

Implementation Plan (timeline): The test isadministered at the conclusion of FREN or SPAN 212 and312 respectively.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D.Acting Dept. Head

Supporting Attachments:SOPI Proficiency Ratings from ACTFL (Adobe Acrobat

Document)Presentation of the various skills levels, from novice tosuperior, of the oral proficiency measures against which wemeasure results of the Simulated Oral Proficiency Exam.

Findings for SOPI Test (Simulated Oral ProficiencyInterview) - French

Summary of Findings: At the end of the second year inFrench, 7% of students scored at the Novice Mid level,14% at the Intermediate Low level, 29% IntermediateMid and 50% at Intermediate High.

Therefore, 93% met our goal of NoviceHigh-Intermediate Low at the conclusion of thesecond-year French course (French 212).

At the end of the French 312 course (Composition andConversation), 14% scored at the Intermediate Lowlevel, 57% at the Intermediate Mid level, 14% atIntermediate High, and 14% at the Advanced level.

Therefore 100% met our standard.

Target Achievement: Exceeded

Recommendations : We recommend that the Outcomelevel for second-year French should be raised toperformance at the Intermediate Low level.

We recommend that the Outcome level for third-yearFrench be raised to the Intermediate Mid level.

Notes :

Substantiating Evidence:


I.2 SPEAKING

Mapped to: National Standards in Foreign Language Education: Standard 1.3: Presentation

Measures & Findings

Auralog pre and post-test - FrenchCourse level; Direct - Exam

Details/Description: TellMeMore is a software programwith built-in voice recognition that measures oral andaural comprehension as well as grammar, culture, andreading skills. This is produced by the company Auralog.

Target: Incoming first-year students are expected toscore between 1.0 and 1.6 on the 10-point scale. At theend of the first year they should score between 2.0-2.6.Incoming second-year students are expected to performbetween 2.0-2.6. At the end of the second year theyshould score between 3.0-3.6.

Implementation Plan (timeline): This was administeredat the beginning of the first and second-year coursesand at the end of the first and second-year sequences.

Key/Responsible Personnel: Dr. Felicia B. SturzerActing Dept. Head

Supporting Attachments:Auralog scores correlated to EU standards (Microsoft

Word)

Findings for Auralog pre and post-test - French

Summary of Findings: The findings at this introductorylevel are as expected. The lowest scores coming inspent less time working in the software but, due toclassroom instruction, still made the greatest gains.Those with the highest incoming scores yielded thelowest gains because language acquisition slows as thematerial gets more difficult. The two groups in themiddle also performed as expected in that the groupthat consistently worked harder, as measured by time inthe software, made greater strides than the group thatworked less. Overall, increases in gain diminished asscores increased, reflecting the increasing difficulty ofthe material.As an absolute measure of progress, students met ourexpectations.At the second year level, lower performing studentsspent substantially more time in the software and mademore substantial improvement. It must be noted thatthose scoring a 1, simply did not engage with theinstrument, so improvements at the lowest level aresuspect and those with 1s should have been discountedin the measures.

Target Achievement: Met

Recommendations : Since there is no remarkabledifference in gain between groups that worked more, asmeasured in time, from those who worked less, weconclude that the software provides little added valuewhen measured against time on task.At the intermediate level, supplemental work did seemto yield better results.

Notes :

Substantiating Evidence:Pre and Post test in Auralog for FREN 211-212 (Word

Document (Open XML))Pre and Post test results in Auralog FREN

101-102 (Word Document (Open XML))

End of Course Oral Exams and SOPI - Spanish
Course level; Direct - Exam

Details/Description: End of course oral exams or the SOPI (Simulated Oral Proficiency Interview) are used to assess speaking proficiency and/or mastery of course speaking objectives. All tests are recorded in instructor Gradebook. 311/312 exams are recorded on tapes maintained in the department.

Target: By the end of elementary Spanish, students will be introduced to this task and can perform at the novice level. 85% of students in Spanish 101, 102, and 211 will receive grades of 70% or better on end of course oral exams. By the end of intermediate Spanish (212), 85% of students will perform at the Novice High or Intermediate Low level. After completing both Spanish 311 and 312 (the courses may be taken in any order), 85% of students will perform at the Intermediate Low level or higher.

Implementation Plan (timeline): Students are administered oral exams by their instructors at the end of Spanish 101, 102, and 211. At the end of 212, 311, and 312, students will receive an SOPI.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Acting Dept. Head

Supporting Attachments:
oral interview 101 (Microsoft Word): Here are the guidelines for the oral interview along with a rubric.

Findings for End of Course Oral Exams and SOPI - Spanish

Summary of Findings: There were no SOPI findings for Spanish 212 and 312 because of a lack of training for new instructors. However, end of course oral exams indicate that at least 85% of students scored 70% or better on end of course exams in Spanish 101, 102, 211, 212, 311, and 312 courses.

Target Achievement: Met

Recommendations : Training for Spanish 211, 212, 311, 312 instructors.

Notes :

Substantiating Evidence:

I Can surveyOther level; Indirect - Survey

Details/Description: Students were asked what theycan do in the target language across the first andsecond year courses.

Target: Students will be 75% proficient at stated goal.

Implementation Plan (timeline): Throughout the2008-2009 year at the first and second year level, asadministered by students of the Psychology dept andsupervised by a professor in the Psychology Dept.

Key/Responsible Personnel: Felicia Sturzer, Ph.D.Acting Dept. Head

Supporting Attachments:I Can survey (Microsoft Word)Survey administered across the first and second yearlanguage courses.

Findings for I Can survey

Summary of Findings: Students met this goal with 75%proficiency.

Target Achievement: Met

Recommendations : Continue to teach this skill ascurrently done.

Notes :

Substantiating Evidence:I Can survey (Microsoft Word)Results of student survey 2008-2009.

Simulated conversations FREN 211Course level; Direct - Student Artifact

Details/Description: Students participate in simulatedconversations with a specific communicative goal. Thiscomprises 20% of their final grade in this course.

Target: Students will achieve 75% proficiency atcommunication according to task assigned.

Implementation Plan (timeline): Throughout theacademic year, 4 times per semester.

Key/Responsible Personnel: Felicia Sturzer, Ph.D.Acting Dept. Head

Supporting Attachments:Sample communicative assignment FREN 211

Intermediate French for Conversation (Microsoft Word)Sample task and grading rubric for communicative task.

Findings for Simulated conversations FREN 211

Summary of Findings: By the end of the secondsemester, students were consistently 75% proficient.

Target Achievement: Met

Recommendations : Continue to build students to 75%proficiency.

Notes :

Substantiating Evidence:


SOPI Test (Simulated Oral ProficiencyInterview) - FrenchCourse level; Direct - Exam

Details/Description: This test is used by governmentagencies and by the American Council on the Teaching ofForeign Languages (ACTFL) to assess speakingproficiency in a target language. The test is recordedand assessed by instructors who have been trained inevaluation by this method.

It assesses speaking ability from Novice throughSuperior. Descriptions of the various levels are attached.

Target: By the end of the intermediate conversationcourse (FREN/SPAN 212) students should attain a NoviceHigh-Intermediate Low rating.

By the end of the composition-conversation course(FREN 312/SPAN 312) students should attainIntermediate Low - High rating.

Implementation Plan (timeline): The test isadministered at the conclusion of FREN or SPAN 212 and312 respectively.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D.Acting Dept. Head

Supporting Attachments:SOPI Proficiency Ratings from ACTFL (Adobe Acrobat

Document)

Findings for SOPI Test (Simulated Oral ProficiencyInterview) - French

Summary of Findings: At the end of the second year inFrench, 7% of students scored at the Novice Mid level,14% at the Intermediate Low level, 29% IntermediateMid and 50% at Intermediate High.

Therefore, 93% met our goal of NoviceHigh-Intermediate Low at the conclusion of thesecond-year French course (French 212).

At the end of the French 312 course (Composition andConversation), 14% scored at the Intermediate Lowlevel, 57% at the Intermediate Mid level, 14% atIntermediate High, and 14% at the Advanced level.

Therefore 100% met our standard.

Target Achievement: Exceeded

Recommendations : We recommend that the Outcomelevel for second-year French should be raised toperformance at the Intermediate Low level.

We recommend that the Outcome level for third-yearFrench be raised to the Intermediate Mid level.

Notes :

Substantiating Evidence:

I.3 SPEAKING

Mapped to: National Standards in Foreign Language Education: Standard 1.2: Written & Spoken

Measures & Findings

Auralog pre and post-test - FrenchCourse level; Direct - Exam

Details/Description: TellMeMore is a software programwith built-in voice recognition that measures oral andaural comprehension as well as grammar, culture, andreading skills. This is produced by the company Auralog.

Target: Incoming first-year students are expected toscore between 1.0 and 1.6 on the 10-point scale. At theend of the first year they should score between 2.0-2.6.Incoming second-year students are expected to performbetween 2.0-2.6. At the end of the second year theyshould score between 3.0-3.6.

Implementation Plan (timeline): This was administeredat the beginning of the first and second-year coursesand at the end of the first and second-year sequences.

Key/Responsible Personnel: Dr. Felicia B. SturzerActing Dept. Head

Supporting Attachments:Auralog scores correlated to EU standards (Microsoft

Word)

Findings for Auralog pre and post-test - French

Summary of Findings: The findings at this introductory level are as expected. The lowest scores coming in spent less time working in the software but, due to classroom instruction, still made the greatest gains. Those with the highest incoming scores yielded the lowest gains because language acquisition slows as the material gets more difficult. The two groups in the middle also performed as expected in that the group that consistently worked harder, as measured by time in the software, made greater strides than the group that worked less. Overall, increases in gain diminished as scores increased, reflecting the increasing difficulty of the material. As an absolute measure of progress, students met our expectations. At the second year level, lower performing students spent substantially more time in the software and made more substantial improvement. It must be noted that those scoring a 1 simply did not engage with the instrument, so improvements at the lowest level are suspect and those with 1s should have been discounted in the measures.

Target Achievement: Met

Recommendations : Since there is no remarkabledifference in gain between groups that worked more, asmeasured in time, from those who worked less, weconclude that the software provides little added valuewhen measured against time on task.At the intermediate level, supplemental work did seemto yield better results.

Notes :

Substantiating Evidence:Pre and Post test in Auralog for FREN 211-212 (Word

Document (Open XML))Pre and Post test results in Auralog FREN

101-102 (Word Document (Open XML))

End of Course Oral Exams or SOPI - SpanishCourse level; Direct - Exam

Details/Description: End of course oral exams or theSOPI, (Simulated Oral Proficiency Interview), are used toassess speaking proficiency and/or mastery of coursespeaking objectives. All tests are recorded in instructorGradebook. 311/312 exams are recorded on tapesmaintained in the department.

Target: By the end of elementary Spanish, students will be introduced to this task and can perform at the novice level. 85% of students in Spanish 101, 102, and 211 will receive grades of 70% or better on end of course oral exams. By the end of intermediate Spanish (212), 85% of students will perform at the Novice High or Intermediate Low level. After completing both Spanish 311 and 312 (the courses may be taken in any order), 85% of students will perform at the Intermediate Low level or higher.

Implementation Plan (timeline): Students areadministered oral exams by their instructors at the endof Spanish 101, 102, 211. At the end of 212, 311, and312, students will receive an SOPI.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D.,Acting Dept. Head

Supporting Attachments:Oral Interview Spanish 101 (Microsoft Word)Here are the guidelines for the oral interview along with arubric.

Findings for End of Course Oral Exams or SOPI -Spanish

Summary of Findings: There were no SOPI findings for Spanish 212 and 312 because of a lack of training for new instructors. However, end of course oral exams indicate that at least 85% of students scored 70% or better on end of course exams in Spanish 101, 102, 211, 212, 311, and 312 courses.

Target Achievement: Met

Recommendations : Training for 211, 212, 311, and 312 instructors on the SOPI.

Notes :

Substantiating Evidence:

I Can surveyOther level; Indirect - Survey

Details/Description: Students were asked what theycan do in the target language across the first andsecond year courses.

Target: Students will be 75% proficient at stated goal.

Implementation Plan (timeline): Throughout the 2008-2009 year at the first and second year level, as administered by students of the Psychology dept and supervised by a professor in the Psychology Dept.

Key/Responsible Personnel: Felicia Sturzer, Ph.D., Acting Dept. Head

Supporting Attachments:
I Can survey (Microsoft Word): Survey administered across the first and second year language courses.

Findings for I Can survey

Summary of Findings: Students were 45% proficient by the end of the second year.

Target Achievement: Not Met

Recommendations : Continue to build this skill.

Notes :

Substantiating Evidence:I Can survey (Microsoft Word)Results of student survey 2008-2009.

SOPI Test (Simulated Oral ProficiencyInterview) - FrenchCourse level; Direct - Exam

Details/Description: This test is used by governmentagencies and by the American Council on the Teaching ofForeign Languages (ACTFL) to assess speakingproficiency in a target language. The test is recordedand assessed by instructors who have been trained inevaluation by this method.

It assesses speaking ability from Novice throughSuperior. Descriptions of the various levels are attached.

Target: By the end of the intermediate conversationcourse (FREN/SPAN 212) students should attain a NoviceHigh-Intermediate Low rating.

By the end of the composition-conversation course(FREN 312/SPAN 312) students should attainIntermediate Low - High rating.

Implementation Plan (timeline): The test isadministered at the conclusion of FREN or SPAN 212 and312 respectively.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D.Acting Dept. Head

Supporting Attachments:SOPI Proficiency Ratings from ACTFL (Adobe Acrobat

Document)

Findings for SOPI Test (Simulated Oral ProficiencyInterview) - French

Summary of Findings: At the end of the second year inFrench, 7% of students scored at the Novice Mid level,14% at the Intermediate Low level, 29% IntermediateMid and 50% at Intermediate High.

Therefore, 93% met our goal of NoviceHigh-Intermediate Low at the conclusion of thesecond-year French course (French 212).

At the end of the French 312 course (Composition andConversation), 14% scored at the Intermediate Lowlevel, 57% at the Intermediate Mid level, 14% atIntermediate High, and 14% at the Advanced level.

Therefore 100% met our standard.

Target Achievement: Exceeded

Recommendations : We recommend that the Outcomelevel for second-year French should be raised toperformance at the Intermediate Low level.

We recommend that the Outcome level for third-yearFrench be raised to the Intermediate Mid level.

Notes :

Substantiating Evidence:

I.4 SPEAKING

Mapped to: National Standards in Foreign Language Education: Standard 1.1: Conversations

Measures & Findings

Auralog pre and post-test - FrenchCourse level; Direct - Exam

Details/Description: TellMeMore is a software programwith built-in voice recognition that measures oral andaural comprehension as well as grammar, culture, andreading skills. This is produced by the company Auralog.

Target: Incoming first-year students are expected to score between 1.0 and 1.6 on the 10-point scale. At the end of the first year they should score between 2.0-2.6. Incoming second-year students are expected to perform between 2.0-2.6. At the end of the second year they should score between 3.0-3.6.

Implementation Plan (timeline): This was administered at the beginning of the first and second-year courses and at the end of the first and second-year sequences.

Key/Responsible Personnel: Dr. Felicia B. Sturzer, Acting Dept. Head

Supporting Attachments:
Auralog scores correlated to EU standards (Microsoft Word)

Findings for Auralog pre and post-test - French

Summary of Findings: The findings at this introductory level are as expected. The lowest scores coming in spent less time working in the software but, due to classroom instruction, still made the greatest gains. Those with the highest incoming scores yielded the lowest gains because language acquisition slows as the material gets more difficult. The two groups in the middle also performed as expected in that the group that consistently worked harder, as measured by time in the software, made greater strides than the group that worked less. Overall, increases in gain diminished as scores increased, reflecting the increasing difficulty of the material. As an absolute measure of progress, students met our expectations. At the second year level, lower performing students spent substantially more time in the software and made more substantial improvement. It must be noted that those scoring a 1 simply did not engage with the instrument, so improvements at the lowest level are suspect and those with 1s should have been discounted in the measures.

Target Achievement: Met

Recommendations : Since there is no remarkabledifference in gain between groups that worked more, asmeasured in time, from those who worked less, weconclude that the software provides little added valuewhen measured against time on task.At the intermediate level, supplemental work did seemto yield better results.

Notes :

Substantiating Evidence:Pre and Post test in Auralog for FREN 211-212 (Word

Document (Open XML))Pre and Post test results in Auralog FREN

101-102 (Word Document (Open XML))

End of Course Oral Exams or SOPI - SpanishCourse level; Direct - Exam

Details/Description: End of course oral exams or theSOPI, (Simulated Oral Proficiency Interview), are used toassess speaking proficiency and/or mastery of coursespeaking objectives. All tests are recorded in instructorGradebook. 311/312 exams are recorded on tapesmaintained in the department.

Target: By the end of elementary Spanish, students will be introduced to this task and can perform at the novice level. 85% of students in Spanish 101, 102, and 211 will receive grades of 70% or better on end of course oral exams. By the end of intermediate Spanish (212), 85% of students will perform at the Novice High or Intermediate Low level. After completing both Spanish 311 and 312 (the courses may be taken in any order), 85% of students will perform at the Intermediate Low level or higher.

Implementation Plan (timeline): Students are administered oral exams by their instructors at the end of Spanish 101, 102, and 211. At the end of 212, 311, and 312, students will receive an SOPI.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Acting Dept. Head

Supporting Attachments:
oral interview Spanish 101 (Microsoft Word): Here are the guidelines for the oral interview along with a rubric.

Findings for End of Course Oral Exams or SOPI - Spanish

Summary of Findings: There were no SOPI findings for Spanish 212 and 312 because of a lack of training for new instructors. However, end of course oral exams indicate that at least 85% of students scored 70% or better on end of course exams in Spanish 101, 102, 211, 212, 311, and 312 courses.

Target Achievement: Met

Recommendations : Training for 211, 212, 311, 312 instructors on SOPI.

Notes :

Substantiating Evidence:

I Can surveyOther level; Indirect - Survey

Details/Description: Students were asked what theycan do in the target language across the first andsecond year courses.

Target: Students will be 75% proficient at stated goal.

Implementation Plan (timeline): Throughout the2008-2009 year at the first and second year level, asadministered by students of the Psychology dept andsupervised by a professor in the Psychology Dept.

Key/Responsible Personnel: Felicia Sturzer, Ph.D.Acting Dept. Head

Supporting Attachments:I Can survey (Microsoft Word)Survey administered across the first and second yearlanguage courses.

Findings for I Can survey

Summary of Findings: Depending upon which detail oneexamines students were 55-70% confident in this skill.

Target Achievement: Not Met

Recommendations : Continue to develop this skill.

Notes :

Substantiating Evidence:I Can survey (Microsoft Word)Results of student survey 2008-2009.

SOPI Test (Simulated Oral ProficiencyInterview) - FrenchCourse level; Direct - Exam

Details/Description: This test is used by governmentagencies and by the American Council on the Teaching ofForeign Languages (ACTFL) to assess speakingproficiency in a target language. The test is recordedand assessed by instructors who have been trained inevaluation by this method.

It assesses speaking ability from Novice throughSuperior. Descriptions of the various levels are attached.

Target: By the end of the intermediate conversationcourse (FREN/SPAN 212) students should attain a NoviceHigh-Intermediate Low rating.

By the end of the composition-conversation course(FREN 312/SPAN 312) students should attainIntermediate Low - High rating.

Implementation Plan (timeline): The test isadministered at the conclusion of FREN or SPAN 212 and312 respectively.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D.Acting Dept. Head

Supporting Attachments:
SOPI Proficiency Ratings from ACTFL (Adobe Acrobat Document)

Findings for SOPI Test (Simulated Oral ProficiencyInterview) - French

Summary of Findings: At the end of the second year inFrench, 7% of students scored at the Novice Mid level,14% at the Intermediate Low level, 29% IntermediateMid and 50% at Intermediate High.

Therefore, 93% met our goal of NoviceHigh-Intermediate Low at the conclusion of thesecond-year French course (French 212).

At the end of the French 312 course (Composition andConversation), 14% scored at the Intermediate Lowlevel, 57% at the Intermediate Mid level, 14% atIntermediate High, and 14% at the Advanced level.

Therefore 100% met our standard.

Target Achievement: Exceeded

Recommendations : We recommend that the Outcomelevel for second-year French should be raised toperformance at the Intermediate Low level.

We recommend that the Outcome level for third-yearFrench be raised to the Intermediate Mid level.

Notes :


Substantiating Evidence:

Task oriented dialogue-FrenchCourse level; Direct - Student Artifact

Details/Description: Students are given a specificcommunicative task for a dialogue with a fellow studentand the rubric by which the quality of communication willbe judged.

Target: Students will demonstrate 75% proficiency inthe communicative task.

Implementation Plan (timeline): Throughout the year,4 times per semester.

Key/Responsible Personnel: Felicia Sturzer, Ph.D.Acting Dept Head

Supporting Attachments:Sample grading rubric for communicative task FREN

211-212 (Microsoft Word)Here students must persuade a fellow student toaccompany them to France in the summer.

Findings for Task oriented dialogue-French

Summary of Findings: By the end of the secondsemester, students were 75% proficient.

Target Achievement: Met

Recommendations : Continue to build this skill.

Notes :

Substantiating Evidence:

I.5 SPEAKING

Mapped to: National Standards in Foreign Language Education: Standard 1.3: Presentation

Measures & Findings

Auralog pre and post-test - FrenchCourse level; Direct - Exam

Details/Description: TellMeMore is a software programwith built-in voice recognition that measures oral andaural comprehension as well as grammar, culture, andreading skills. This is produced by the company Auralog.

Target: Incoming first-year students are expected toscore between 1.0 and 1.6 on the 10-point scale. At theend of the first year they should score between 2.0-2.6.Incoming second-year students are expected to performbetween 2.0-2.6. At the end of the second year theyshould score between 3.0-3.6.

Implementation Plan (timeline): This was administeredat the beginning of the first and second-year coursesand at the end of the first and second-year sequences.

Key/Responsible Personnel: Dr. Felicia B. SturzerActing Dept. Head

Supporting Attachments:Auralog scores correlated to EU standards (Microsoft

Word)

Findings for Auralog pre and post-test - French

Summary of Findings: The findings at this introductorylevel are as expected. The lowest scores coming inspent less time working in the software but, due toclassroom instruction, still made the greatest gains.Those with the highest incoming scores yielded thelowest gains because language acquisition slows as thematerial gets more difficult. The two groups in themiddle also performed as expected in that the groupthat consistently worked harder, as measured by time inthe software, made greater strides than the group thatworked less. Overall, increases in gain diminished asscores increased, reflecting the increasing difficulty ofthe material.As an absolute measure of progress, students met ourexpectations.At the second year level, lower performing studentsspent substantially more time in the software and mademore substantial improvement. It must be noted thatthose scoring a 1, simply did not engage with theinstrument, so improvements at the lowest level aresuspect and those with 1s should have been discountedin the measures.

Target Achievement: Met

Recommendations : Since there is no remarkable difference in gain between groups that worked more, as measured in time, from those who worked less, we conclude that the software provides little added value when measured against time on task. At the intermediate level, supplemental work did seem to yield better results.

Notes :

Substantiating Evidence:Pre and Post test in Auralog for FREN 211-212 (Word

Document (Open XML))Pre and Post test results in Auralog FREN

101-102 (Word Document (Open XML))

End of Course Oral Exams or SOPI - SpanishCourse level; Direct - Exam

Details/Description: End of course oral exams or theSOPI, (Simulated Oral Proficiency Interview), are used toassess speaking proficiency and/or mastery of coursespeaking objectives. All tests are recorded in instructorGradebook. 311/312 exams are recorded on tapesmaintained in the department.

Target: By the end of elementary Spanish, students will be introduced to this task and can perform at the novice level. 85% of students in Spanish 101, 102, and 211 will receive grades of 70% or better on end of course oral exams. By the end of intermediate Spanish (212), 85% of students will perform at the Novice High or Intermediate Low level. After completing both Spanish 311 and 312 (the courses may be taken in any order), 85% of students will perform at the Intermediate Low level or higher.

Implementation Plan (timeline): Students areadministered oral exams by their instructors at the endof Spanish 101, 102, 211. At the end of 212, 311, and312, students will receive an SOPI.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D.,Acting Dept. Head

Supporting Attachments:Oral interview Spanish 101 (Microsoft Word)Here are the guidelines for the oral interview along with arubric.

Findings for End of Course Oral Exams or SOPI -Spanish

Summary of Findings: There were no SOPI findings for Spanish 212 and 312 because of a lack of training for new instructors. However, end of course oral exams indicate that at least 85% of students scored 70% or better on end of course exams in Spanish 101, 102, 211, 212, 311, and 312 courses.

Target Achievement: Met

Recommendations : Training for Spanish 211, 212, 311, 312 instructors.

Notes :

Substantiating Evidence:

I Can surveyOther level; Indirect - Survey

Details/Description: Students were asked what theycan do in the target language across the first andsecond year courses.

Target: Students will be 75% proficient at stated goal.

Implementation Plan (timeline): Throughout the2008-2009 year at the first and second year level, asadministered by students of the Psychology dept andsupervised by a professor in the Psychology Dept.

Key/Responsible Personnel: Felicia Sturzer, Ph.D., Acting Dept. Head

Supporting Attachments:
I Can survey (Microsoft Word): Survey administered across the first and second year language sequence.

Findings for I Can survey

Summary of Findings: Students felt they were about 55% proficient at this by the end of the second year.

Target Achievement: Not Met

Recommendations : Continue to build this skill.

Notes :

Substantiating Evidence:
I Can survey (Microsoft Word): Results of student survey 2008-2009.

SOPI Test (Simulated Oral ProficiencyInterview) - FrenchCourse level; Direct - Exam

Details/Description: This test is used by governmentagencies and by the American Council on the Teaching ofForeign Languages (ACTFL) to assess speakingproficiency in a target language. The test is recordedand assessed by instructors who have been trained inevaluation by this method.

It assesses speaking ability from Novice throughSuperior. Descriptions of the various levels are attached.

Target: By the end of the intermediate conversationcourse (FREN/SPAN 212) students should attain a NoviceHigh-Intermediate Low rating.

By the end of the composition-conversation course(FREN 312/SPAN 312) students should attainIntermediate Low - High rating.

Implementation Plan (timeline): The test isadministered at the conclusion of FREN or SPAN 212 and312 respectively.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D.Acting Dept. Head

Supporting Attachments:SOPI Proficiency Ratings from ACTFL (Adobe Acrobat

Document)

Findings for SOPI Test (Simulated Oral ProficiencyInterview) - French

Summary of Findings: At the end of the second year inFrench, 7% of students scored at the Novice Mid level,14% at the Intermediate Low level, 29% IntermediateMid and 50% at Intermediate High.

Therefore, 93% met our goal of NoviceHigh-Intermediate Low at the conclusion of thesecond-year French course (French 212).

At the end of the French 312 course (Composition andConversation), 14% scored at the Intermediate Lowlevel, 57% at the Intermediate Mid level, 14% atIntermediate High, and 14% at the Advanced level.

Therefore 100% met our standard.

Target Achievement: Exceeded

Recommendations : We recommend that the Outcomelevel for second-year French should be raised toperformance at the Intermediate Low level.

We recommend that the Outcome level for third-yearFrench be raised to the Intermediate Mid level.

Notes :

Substantiating Evidence:

II.1 LISTENING

Mapped to: National Standards in Foreign Language Education: Standard 1.1: Conversations

Measures & Findings

Auralog pre and post-test - French
Course level; Direct - Exam

Details/Description: TellMeMore is a software program with built-in voice recognition that measures oral and aural comprehension as well as grammar, culture, and reading skills. This is produced by the company Auralog.

Target: Incoming first-year students are expected to score between 1.0 and 1.6 on the 10-point scale. At the end of the first year they should score between 2.0-2.6. Incoming second-year students are expected to perform between 2.0-2.6. At the end of the second year they should score between 3.0-3.6.

Implementation Plan (timeline): This was administered at the beginning of the first and second-year courses and at the end of the first and second-year sequences.

Key/Responsible Personnel: Dr. Felicia B. Sturzer, Acting Dept. Head

Supporting Attachments: Auralog scores correlated to EU standards (Microsoft Word)

Findings for Auralog pre and post-test - French

Summary of Findings: The findings at this introductory level are as expected. The lowest scorers coming in spent less time working in the software but, due to classroom instruction, still made the greatest gains. Those with the highest incoming scores yielded the lowest gains because language acquisition slows as the material gets more difficult. The two groups in the middle also performed as expected, in that the group that consistently worked harder, as measured by time in the software, made greater strides than the group that worked less. Overall, increases in gain diminished as scores increased, reflecting the increasing difficulty of the material. As an absolute measure of progress, students met our expectations. At the second-year level, lower-performing students spent substantially more time in the software and made more substantial improvement. It must be noted that those scoring a 1 simply did not engage with the instrument, so improvements at the lowest level are suspect and those with 1s should have been discounted in the measures.

Target Achievement: Met

Recommendations : Since there is no remarkable difference in gain between groups that worked more, as measured in time, and those who worked less, we conclude that the software provides little added value when measured against time on task. At the intermediate level, supplemental work did seem to yield better results.

Notes :

Substantiating Evidence:
Pre and Post test in Auralog for FREN 211-212 (Word Document (Open XML))
Pre and Post test results in Auralog FREN 101-102 (Word Document (Open XML))

Exam - Listening Component Spanish
Course level; Direct - Exam

Details/Description: Each exam of Spanish 101, 102, 211, and 212, including the final exam, has a listening comprehension component.

Target: 85% of students will complete 70% of the sections correctly.

Implementation Plan (timeline): Each exam, including the final exam, of Spanish 101, 102, 211, and 212 has a listening comprehension component.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Acting Dept. Head

Supporting Attachments: Final Exam Fall 2008.doc (Microsoft Word)

Spanish 102 Final Exam 2009a.doc (Microsoft Word)

Findings for Exam - Listening Component Spanish

Summary of Findings: In listening assessment, a random sample of 3 sections of Spanish 101 and 102 exams in 2008-09 indicated that 67% of Spanish 101 and 102 students met the standard of 70% correct. However, 80% of students on the 101 exams met the standard of 70% correct. Poor scores on the 102 exam were probably the result of combining listening comprehension with writing skills. In Spanish 212, listening comprehension scores as measured by Auralog in 2008 and 2009 indicate that 85% and 83%, respectively, achieved the level of Intermediate or above, very near to the department goal of 85%. In Spanish 312 in 2008 and 2009, 100% of students scored Intermediate or better, with 80% and 67%, respectively, scoring Intermediate High or Advanced.

Target Achievement: Not Met

Recommendations : Test listening comprehension in first and second year courses separately, without combining it with writing.

Notes :

Substantiating Evidence:

I Can survey
Other level; Indirect - Survey

Details/Description: Students were asked what they can do in the target language across the first and second year courses.

Target: Students will be 75% proficient at the stated goal.

Implementation Plan (timeline): Throughout the 2008-2009 year at the first and second year level, as administered by students of the Psychology Dept. and supervised by a professor in the Psychology Dept.

Key/Responsible Personnel: Felicia Sturzer, Ph.D., Acting Dept. Head

Supporting Attachments: I Can survey (Microsoft Word): Survey administered across the first and second year language sequence.

Findings for I Can survey

Summary of Findings: Whether listening to the news or conversing with a native speaker, students felt that they were 25% proficient at this goal.

Target Achievement: Not Met

Recommendations : Continue to build this skill.

Notes :

Substantiating Evidence: I Can survey (Microsoft Word): Results of student survey 2008-2009.

Immersion Instruction in Spanish
Course level; Indirect - Other

Details/Description: Spanish 211 and 212 are taught primarily in Spanish, with the occasional use of English for more complex grammar points. 300/400-level Spanish courses are taught entirely in Spanish.

Target: Spanish 211/212, 311, 312, 321, 322, 323, 325, 400-level courses.

Implementation Plan (timeline): In Spanish 211/212 most instruction is in Spanish. In 300-400 classes, all instruction is in Spanish.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Dept. Head

Supporting Attachments:

Findings for Immersion Instruction in Spanish

Summary of Findings: There is no exam for this; however, this practice is in place.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Listening Comprehension Exercises
Course level; Direct - Student Artifact

Details/Description: Students complete an on-line workbook which accompanies the textbooks. Students in 101 and 102 use the Quia program for Puntos en Breve, while those in 211 and 212 used a hardcover workbook/lab manual with Entre Nosotros.

Target: 85% of students will successfully complete 70% of listening activities.

Implementation Plan (timeline): All Spanish 101, 102, 211, and 212 students complete listening comprehension exercises as part of their on-line workbook and lab manual.

Key/Responsible Personnel: Felicia Sturzer, Ph.D., Acting Dept. Head

Supporting Attachments:

Findings for Listening Comprehension Exercises

Summary of Findings: This area is not tracked separately.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Spanish Film Series
Institution level; Indirect - Other

Details/Description: The Department sponsors a Spanish film series, which students and the community can attend.

Findings for Spanish Film Series

Summary of Findings: This area is not tracked.

Target Achievement: Met


Target: The community, Spanish students, other UTC students.

Implementation Plan (timeline): During the Fall and Spring Semester the Department of Foreign Languages and Literatures hosts the Film Series, which is open to the public. Typically 3-4 films are shown each semester.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Dept. Head

Supporting Attachments: Spanish Film series Fall 2008 (Microsoft Word)

Recommendations :

Notes :

Substantiating Evidence:

Video Comprehension Measure - French
Course level; Direct - Other

Details/Description: Students screen videos in the target language and answer comprehension questions in various formats.

Target: A minimum 75% comprehension rate.

Implementation Plan (timeline): This will be implemented across our curriculum in 200, 300 and 400-level courses.

Key/Responsible Personnel: Dr. Felicia B. Sturzer

Supporting Attachments:

Findings for Video Comprehension Measure - French

Summary of Findings: We have no findings to report at this time because video comprehension was not tracked as a separate assessment tool for purposes of this measure for either the 200 or 300 level courses.

Target Achievement:

Recommendations : We will attempt to track video comprehension scores separately for purposes of assessment.

Notes : We are unable to judge at this time whether the target was achieved.

Substantiating Evidence:

II.2 LISTENING

Mapped to: National Standards in Foreign Language Education: Standard 1.2: Written & Spoken

Measures & Findings

Auralog pre and post-test - French
Course level; Direct - Exam

Details/Description: TellMeMore is a software program with built-in voice recognition that measures oral and aural comprehension as well as grammar, culture, and reading skills. This is produced by the company Auralog.

Target: Incoming first-year students are expected to score between 1.0 and 1.6 on the 10-point scale. At the end of the first year they should score between 2.0-2.6. Incoming second-year students are expected to perform between 2.0-2.6. At the end of the second year they should score between 3.0-3.6.

Implementation Plan (timeline): This was administered at the beginning of the first and second-year courses and at the end of the first and second-year sequences.

Key/Responsible Personnel: Dr. Felicia B. Sturzer, Acting Dept. Head

Supporting Attachments: Auralog scores correlated to EU standards (Microsoft Word)

Findings for Auralog pre and post-test - French

Summary of Findings: The findings at this introductory level are as expected. The lowest scorers coming in spent less time working in the software but, due to classroom instruction, still made the greatest gains. Those with the highest incoming scores yielded the lowest gains because language acquisition slows as the material gets more difficult. The two groups in the middle also performed as expected, in that the group that consistently worked harder, as measured by time in the software, made greater strides than the group that worked less. Overall, increases in gain diminished as scores increased, reflecting the increasing difficulty of the material. As an absolute measure of progress, students met our expectations. At the second-year level, lower-performing students spent substantially more time in the software and made more substantial improvement. It must be noted that those scoring a 1 simply did not engage with the instrument, so improvements at the lowest level are suspect and those with 1s should have been discounted in the measures.

Target Achievement: Met

Recommendations : Since there is no remarkable difference in gain between groups that worked more, as measured in time, and those who worked less, we conclude that the software provides little added value when measured against time on task. At the intermediate level, supplemental work did seem to yield better results.

Notes :

Substantiating Evidence:
Pre and Post test in Auralog for FREN 211-212 (Word Document (Open XML))
Pre and Post test results in Auralog FREN 101-102 (Word Document (Open XML))

Exam - Listening Component Spanish
Course level; Direct - Exam

Details/Description: Each exam of Spanish 101, 102, 211, and 212, including the final exam, has a listening comprehension component.

Target: 80% of students will complete 70% of the sections correctly.

Implementation Plan (timeline): Each exam, including the final exam, of Spanish 101, 102, 211, and 212 has a listening comprehension component.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Acting Dept. Head

Supporting Attachments: Final Exam Fall 2008.doc (Microsoft Word)

Spanish 102 Final Exam 2009a.doc (Microsoft Word)

Findings for Exam - Listening Component Spanish

Summary of Findings: In listening assessment, a random sample of 3 sections of Spanish 101 and 102 exams in 2008-09 indicated that 67% of Spanish 101 and 102 students met the standard of 70% correct. However, 80% of students on the 101 exams met the standard of 70% correct. Poor scores on the 102 exam were probably the result of combining listening comprehension with writing skills. In Spanish 212, listening comprehension scores as measured by Auralog in 2008 and 2009 indicate that 85% and 83%, respectively, achieved the level of Intermediate or above, very near to the department goal of 85%. In Spanish 312 in 2008 and 2009, 100% of students scored Intermediate or better, with 80% and 67%, respectively, scoring Intermediate High or Advanced.

Target Achievement: Not Met

Recommendations : Test listening comprehension separately on first and second year exams.

Notes :

Substantiating Evidence:

I Can survey
Other level; Indirect - Survey

Details/Description: Students were asked what they can do in the target language across the first and second year courses.

Target: Students will be 75% proficient at the stated goal.

Implementation Plan (timeline): Throughout the 2008-2009 year at the first and second year level, as administered by students of the Psychology Dept. and supervised by a professor in the Psychology Dept.

Key/Responsible Personnel: Felicia Sturzer, Ph.D., Acting Dept. Head

Findings for I Can survey

Summary of Findings: 70% felt that they could answer a direct question, but only 20% felt that they could function in class without English.

Target Achievement: Not Met

Recommendations : Continue to build these skills.

Notes :

Substantiating Evidence: I Can survey (Microsoft Word): Results of student survey 2008-2009.


Supporting Attachments: I Can survey (Microsoft Word): Survey administered across the first and second year language sequence.

Listening Comprehension Exercises
Course level; Direct - Student Artifact

Details/Description: Students complete an on-line workbook which accompanies the textbooks. Students in 101 and 102 use the Quia program for Puntos en Breve, while those in 211 and 212 used a hardcover workbook/lab manual with Entre Nosotros.

Target: 85% of students will successfully complete 70% of listening activities.

Implementation Plan (timeline): All Spanish 101, 102, 211, and 212 students complete listening comprehension exercises as part of their on-line or hard copy workbook and lab manual.

Key/Responsible Personnel:

Supporting Attachments:

Findings for Listening Comprehension Exercises

Summary of Findings: This area is not tracked separately (it is part of the workbook grade).

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Oral questions; aural comprehension tasks - French
Course level; Direct - Other

Details/Description: Students will be asked to perform a variety of tasks, including answering questions orally or in writing, completing listening comprehension exercises, and demonstrating the ability to follow directions.

Target: A minimum of 75% of students are able to complete the tasks successfully.

Implementation Plan (timeline): This will be implemented across our curriculum in 200, 300 and 400 level courses.

Key/Responsible Personnel: Dr. Felicia B. Sturzer

Supporting Attachments:
Sample exam-French 311-Composition & Conversation (Microsoft Word): This covers the first two chapters of the book Les Francais and assesses oral/aural comprehension, reading comprehension, cultural comparisons, and writing short essays.
sample test FREN 101 (Microsoft Word): The first page of the test is exercises where students must listen and follow directions, either noting facts or following directions on a map.

Findings for Oral questions; aural comprehension tasks - French

Summary of Findings: In both the 200 and 300 level courses, this task was repeatedly practiced and assessed in the form of homework and classroom exercises as well as incorporated into testing. However, no separate tracking occurred.

Target Achievement:

Recommendations : We will attempt to track these tasks on a more consistent basis.

Notes : We are unable to assess these tasks at this time.

Substantiating Evidence:

II.3 LISTENING

Mapped to: National Standards in Foreign Language Education: Standard 1.2: Written & Spoken

Measures & Findings

Auralog pre and post-test - French
Course level; Direct - Exam

Details/Description: TellMeMore is a software program with built-in voice recognition that measures oral and aural comprehension as well as grammar, culture, and reading skills. This is produced by the company Auralog.

Target: Incoming first-year students are expected to score between 1.0 and 1.6 on the 10-point scale. At the end of the first year they should score between 2.0-2.6. Incoming second-year students are expected to perform between 2.0-2.6. At the end of the second year they should score between 3.0-3.6.

Implementation Plan (timeline): This was administered at the beginning of the first and second-year courses and at the end of the first and second-year sequences.

Key/Responsible Personnel: Dr. Felicia B. Sturzer, Acting Dept. Head

Supporting Attachments: Auralog scores correlated to EU standards (Microsoft Word)

Findings for Auralog pre and post-test - French

Summary of Findings: The findings at this introductory level are as expected. The lowest scorers coming in spent less time working in the software but, due to classroom instruction, still made the greatest gains. Those with the highest incoming scores yielded the lowest gains because language acquisition slows as the material gets more difficult. The two groups in the middle also performed as expected, in that the group that consistently worked harder, as measured by time in the software, made greater strides than the group that worked less. Overall, increases in gain diminished as scores increased, reflecting the increasing difficulty of the material. As an absolute measure of progress, students met our expectations. At the second-year level, lower-performing students spent substantially more time in the software and made more substantial improvement. It must be noted that those scoring a 1 simply did not engage with the instrument, so improvements at the lowest level are suspect and those with 1s should have been discounted in the measures.

Target Achievement: Met

Recommendations : Since there is no remarkable difference in gain between groups that worked more, as measured in time, and those who worked less, we conclude that the software provides little added value when measured against time on task. At the intermediate level, supplemental work did seem to yield better results.

Notes :

Substantiating Evidence:
Pre and Post test in Auralog for FREN 211-212 (Word Document (Open XML))
Pre and Post test results in Auralog FREN 101-102 (Word Document (Open XML))

Dictations and cloze exercises - French
Course level; Direct - Other

Details/Description: Students are given dictations and listening comprehension exercises that require completion of vocabulary, grammar, or phrases.

Target: A minimum of 75% of students will successfully complete them.

Implementation Plan (timeline): This will be implemented across our curriculum in our 200, 300 and 400 level courses.

Key/Responsible Personnel: Dr. Felicia B. Sturzer

Supporting Attachments:

Findings for Dictations and cloze exercises - French

Summary of Findings: At the 300 level, based on a small sample of separate dictations, 71% performed at or above the expected level. In addition, every exam includes an oral comprehension and/or dictation, which were not scored separately for purposes of this assessment.

At the 200 level, no separate dictations were given. However, every exam included an oral comprehension and/or dictation, which were not scored separately for purposes of this assessment.


sample exam FREN 401 Francophone Lit (Microsoft Word)

sample exam FREN211 (Microsoft Word): The first exercise, as with all first exercises on exams in FREN 211 and 212, is a dictation exercise. These tests follow up on practice cloze exercises performed during the 3rd and 4th lesson of every chapter, where students complete cloze versions of recorded dialogues.

Target Achievement: Not Met

Recommendations : We will begin to track oral components separately for the purpose of outcomes assessment in both the 200 and 300 level courses.

Notes :

Substantiating Evidence:

Exam - Listening Component Spanish
Course level; Direct - Exam

Details/Description: Each exam of Spanish 101, 102, 211, and 212, including the final exam, has a listening comprehension component.

Target: 80% of students will complete 70% of the sections correctly.

Implementation Plan (timeline): Each exam, including the final exam, of Spanish 101, 102, 211, and 212 has a listening comprehension component.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Acting Dept. Head

Supporting Attachments:
Exam Final Fall 08 Spanish 101 (Microsoft Word)
Final exam Spring 102 Spanish 102 (Microsoft Word)

Findings for Exam - Listening Component Spanish

Summary of Findings: In listening assessment, a random sample of 3 sections of Spanish 101 and 102 exams in 2008-09 indicated that 67% of Spanish 101 and 102 students met the standard of 70% correct. However, 80% of students on the 101 exams met the standard of 70% correct. Poor scores on the 102 exam were probably the result of combining listening comprehension with writing skills. In Spanish 212, listening comprehension scores as measured by Auralog in 2008 and 2009 indicate that 85% and 83%, respectively, achieved the level of Intermediate or above, very near to the department goal of 85%. In Spanish 312 in 2008 and 2009, 100% of students scored Intermediate or better, with 80% and 67%, respectively, scoring Intermediate High or Advanced.

Target Achievement: Not Met

Recommendations : Do not combine listening comprehension with writing.

Notes :

Substantiating Evidence:

I Can survey
Other level; Indirect - Survey

Details/Description: Students were asked what they can do in the target language across the first and second year courses.

Target: Students will be 75% proficient at the stated goal.

Implementation Plan (timeline): Throughout the 2008-2009 year at the first and second year level, as administered by students of the Psychology Dept. and supervised by a professor in the Psychology Dept.

Key/Responsible Personnel: Felicia Sturzer, Ph.D., Acting Dept. Head

Supporting Attachments: I Can survey (Microsoft Word): Survey administered across the first and second year language sequence.

Findings for I Can survey

Summary of Findings: Only 16% of students felt they could follow a foreign language film without subtitles.

Target Achievement:

Recommendations : Continue to build this skill.

Notes :

Substantiating Evidence: I Can survey (Microsoft Word): Results of student survey 2008-2009.

Listening Comprehension Exercises - Spanish
Course level; Direct - Student Artifact

Details/Description: Students complete an on-line workbook which accompanies the textbooks. Students in 101 and 102 use the Quia program for Puntos en Breve, while those in 211 and 212 used a hardcover workbook/lab manual with Entre Nosotros.

Target: 85% of students will successfully complete 70% of listening activities.

Implementation Plan (timeline): All Spanish 101, 102, 211, and 212 students complete listening comprehension exercises as part of their workbook and lab manual.

Key/Responsible Personnel:

Supporting Attachments:

Findings for Listening Comprehension Exercises - Spanish

Summary of Findings: This area is not tracked separately from the workbook/lab manual grade.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Video Comprehension Measure - French
Course level; Direct - Other

Details/Description: Students screen videos in the target language and answer comprehension questions in various formats.

Target: A minimum 75% comprehension rate.

Implementation Plan (timeline): This will be implemented across our curriculum in 200, 300 and 400-level courses.

Key/Responsible Personnel: Dr. Felicia B. Sturzer

Supporting Attachments: Paris Video Questions-French 311-Composition and Conversation (Microsoft Word)

No Findings Added to Video Comprehension Measure - French

II.4 LISTENING

Mapped to: National Standards in Foreign Language Education: Standard 2.2: Products & Perspectives

Measures & Findings

Exam - Listening Component Spanish
Course level; Direct - Exam

Details/Description: Each exam of Spanish 101, 102, 211, and 212, including the final exam, has a listening comprehension component.

Target: 80% of students will complete 70% of the sections correctly.

Implementation Plan (timeline): Each exam, including the final exam, of Spanish 101, 102, 211, and 212 has a listening comprehension component.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Acting Dept. Head

Supporting Attachments: Final Exam Fall 2008 Spanish 101 (Microsoft Word)

Final exam Spring 2009 Spanish 102 (Microsoft Word)

Findings for Exam - Listening Component Spanish

Summary of Findings: In listening assessment, a random sample of 3 sections of Spanish 101 and 102 exams in 2008-09 indicated that 67% of Spanish 101 and 102 students met the standard of 70% correct. However, 80% of students on the 101 exams met the standard of 70% correct. Poor scores on the 102 exam were probably the result of combining listening comprehension with writing skills. In Spanish 212, listening comprehension scores as measured by Auralog in 2008 and 2009 indicate that 85% and 83%, respectively, achieved the level of Intermediate or above, very near to the department goal of 85%. In Spanish 312 in 2008 and 2009, 100% of students scored Intermediate or better, with 80% and 67%, respectively, scoring Intermediate High or Advanced.

Target Achievement: Not Met

Recommendations : Do not combine listening comprehension and writing components.

Notes :


Substantiating Evidence:

I Can survey
Other level; Indirect - Survey

Details/Description: Students were asked what they can do in the target language across the first and second year courses.

Target: Students will be 75% proficient at the stated goal.

Implementation Plan (timeline): Throughout the 2008-2009 year at the first and second year level, as administered by students of the Psychology Dept. and supervised by a professor in the Psychology Dept.

Key/Responsible Personnel: Felicia Sturzer, Ph.D., Acting Dept. Head

Supporting Attachments: I Can survey (Microsoft Word): Survey administered across the first and second year language sequence.

Findings for I Can survey

Summary of Findings: Only 16% felt they could follow a foreign language film without subtitles.

Target Achievement: Met

Recommendations : Strong start for this level, as it is not mastered until a higher level.

Notes :

Substantiating Evidence: I Can survey (Microsoft Word): Results of student survey 2008-2009.

Listening Comprehension Exercises - Spanish
Course level; Direct - Student Artifact

Details/Description: Students complete an on-line workbook which accompanies the textbooks. Students in 101 and 102 use the Quia program for Puntos en Breve, while those in 211 and 212 used a hard copy workbook/lab manual with Entre Nosotros.

Target: 85% of students will successfully complete 70% of listening activities.

Implementation Plan (timeline): All Spanish 101, 102, 211, and 212 students complete listening comprehension exercises as part of their workbook and lab manual.

Key/Responsible Personnel:

Supporting Attachments:

Findings for Listening Comprehension Exercises - Spanish

Summary of Findings: This area was not tracked separately.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Spanish 499 Spanish Film Class
Course level; Direct - Exam

Details/Description: The Spanish 499 Film Class taught Spanish film, literature, and culture relating to the Spanish Civil War.

Target: 85% of select Spanish majors and minors will receive 80% or better on the final exam.

Implementation Plan (timeline): Students who have completed Spanish 311, 312, 321, and 322 may take this course to meet graduation requirements of two 400-level courses for the major, or 4 300-400 level courses for the minor.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Acting Department Head

Supporting Attachments:

Findings for Spanish 499 Spanish Film Class

Summary of Findings: This measure did not track listening comprehension separately; however, more than 85% of students received 80% or better on midterm and final exams.

Target Achievement: Exceeded

Recommendations :

Notes : Records maintained in gradebook.

Substantiating Evidence:


Examen Final 499 Summer 2009.doc (Microsoft Word)

Spanish 499 SU2099 Midterm Exam.doc (Microsoft Word)

Televised reports - French
Course level; Direct - Other

Details/Description: Students watch televised reports in the target language and perform an assessment task based on the content.

Target: A minimum of 75% of students are able to perform at this level.

Implementation Plan (timeline): This measure will be implemented only in the 300 and 400 level courses.

Key/Responsible Personnel: Dr. Felicia B. Sturzer

Supporting Attachments:

Findings for Televised reports - French

Summary of Findings: This task is not introduced until the 300 level courses. However, no separate tracking system was used to assess this skill. It was incorporated into the course work and testing.

Target Achievement:

Recommendations : For the 300 level courses, we will attempt to track this skill more systematically in the future.

Notes : We were unable to track this skill.

Substantiating Evidence:

Video Comprehension Measure - French
Course level; Direct - Other

Details/Description: Students screen videos in the target language and answer comprehension questions in various formats.

Target: A minimum 75% comprehension rate.

Implementation Plan (timeline): This will be implemented across our curriculum in 200, 300 and 400-level courses.

Key/Responsible Personnel: Dr. Felicia B. Sturzer

Supporting Attachments:

Findings for Video Comprehension Measure - French

Summary of Findings: This task is not introduced until the 300 level courses. However, no separate tracking system was used to assess this skill. It was incorporated into the course work and testing.

Target Achievement: Not Met

Recommendations : For the 300 level courses, we will attempt to track this skill more systematically in the future.

Notes : We were unable to track this skill.

Substantiating Evidence:

III.1 READING

Mapped to: National Standards in Foreign Language Education: Standard 1.2: Written & Spoken

Measures & Findings

Auralog pre and post-test - French
Course level; Direct - Exam

Details/Description: TellMeMore is a software program with built-in voice recognition that measures oral and aural comprehension as well as grammar, culture, and reading skills. This is produced by the company Auralog.

Target: Incoming first-year students are expected to score between 1.0 and 1.6 on the 10-point scale. At the end of the first year they should score between 2.0-2.6. Incoming second-year students are expected to perform between 2.0-2.6. At the end of the second year they should score between 3.0-3.6.

Implementation Plan (timeline): This was administered at the beginning of the first and second-year courses and at the end of the first and second-year sequences.

Findings for Auralog pre and post-test - French

Summary of Findings: The findings at this introductory level are as expected. The lowest scorers coming in spent less time working in the software but, due to classroom instruction, still made the greatest gains. Those with the highest incoming scores yielded the lowest gains because language acquisition slows as the material gets more difficult. The two groups in the middle also performed as expected, in that the group that consistently worked harder, as measured by time in the software, made greater strides than the group that worked less. Overall, increases in gain diminished as scores increased, reflecting the increasing difficulty of the material. As an absolute measure of progress, students met our expectations. At the second-year level, lower-performing students spent substantially more time in the software and made more substantial improvement. It must be noted that those scoring a 1 simply did not engage with the instrument, so improvements at the lowest level are suspect and those with 1s should have been discounted in the measures.

Key/Responsible Personnel: Dr. Felicia B. Sturzer, Acting Dept. Head

Supporting Attachments: Auralog scores correlated to EU standards (Microsoft Word)

Target Achievement: Met

Recommendations : Since there is no remarkable difference in gain between groups that worked more, as measured in time, and those who worked less, we conclude that the software provides little added value when measured against time on task. At the intermediate level, supplemental work did seem to yield better results.

Notes :

Substantiating Evidence:
Pre and Post test in Auralog for FREN 211-212 (Word Document (Open XML))
Pre and Post test results in Auralog FREN 101-102 (Word Document (Open XML))

Exam - Reading Comprehension Component - Spanish
Course level; Direct - Exam

Details/Description: Each exam in Spanish 101, 102, 211, and 212 includes a reading comprehension section.

Target: Students will successfully complete 70% of the reading comprehension section of exams.

Implementation Plan (timeline): Students in Spanish 101, 102, 211, and 212 are administered a reading comprehension section in each exam, including the final exam.

Key/Responsible Personnel: Dr. Felicia B. Sturzer, Acting Dept. Head

Supporting Attachments: Final exam Spanish 101 (Microsoft Word)

Final exam Spring 2009 Spanish 102 (Microsoft Word)

Findings for Exam - Reading Comprehension Component - Spanish

Summary of Findings: Outcomes for reading in Spanish 212 and 312 were identical to those for listening, achieving departmental goals for each level. Reading scores from a random sample of 101 and 102 final exams from 2008-2009 showed that 70% of students scored 70% or better, indicating a need for some improvement. However, in general, elementary instructors felt that students scored well on reading comprehension sections of exams, although separate reading scores were not maintained.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

I Can survey
Other level; Indirect - Survey

Details/Description: Students were asked what they can do in the target language across the first and second year courses.

Target: Students will be 75% proficient at the stated goal.

Implementation Plan (timeline): Throughout the 2008-2009 year at the first and second year level, as administered by students of the Psychology Dept. and supervised by a professor in the Psychology Dept.

Findings for I Can survey

Summary of Findings: 76% felt they could understand a magazine or newspaper story.

Target Achievement: Met

Recommendations : Continue to build this skill.

Notes :

Substantiating Evidence:


Key/Responsible Personnel: Felicia Sturzer, Ph.D., Acting Dept. Head

Supporting Attachments:
I Can survey (Microsoft Word): Survey administered across the first and second year language sequence.
I Can survey (Microsoft Word): Results of student survey 2008-2009.

Reading comprehension - French
Course level; Direct - Other

Details/Description: Students are exposed to a variety of written realia in the target language, consisting of articles, web sites, newspapers, brochures, menus, etc., and asked to demonstrate comprehension through a variety of measures.

Target: A minimum of 75% of students are able to perform this task at their level.

Implementation Plan (timeline): This measure is implemented from the elementary to the advanced level in the target language.

Key/Responsible Personnel: Dr. Felicia B. Sturzer

Supporting Attachments:
Sample article for French 312-Conversation and Comprehension (Adobe Acrobat Document): The article is taken from a French newspaper. Students read articles for comprehension and report to the class in both oral and written form.
vocabulary exercise FREN 323 French Culture & Civilization (Microsoft Word): Students follow links to read websites published by French museums on sites and artifacts from prehistoric to modern times for class discussion. Short vocabulary quizzes test comprehension prior to discussion in class.

Findings for Reading comprehension - French

Summary of Findings: At the 200 level, students researched and debated contemporary issues in French, and 100% scored at the expected 75% comprehension rate.

At the 300 level, students researched articles on contemporary issues, and 100% demonstrated an 85% comprehension rate. In addition, students presented summaries of readings on French culture, and 100% demonstrated an 85% comprehension rate.

Target Achievement: Exceeded

Recommendations : We will track targeted skills more systematically.

Notes :

Substantiating Evidence:

III.2 READING

Mapped to: National Standards in Foreign Language Education: Standard 3.2: Distinctive Viewpoints

Measures & Findings

Textual Analysis - French
Course level; Direct - Student Artifact

Details/Description: Students demonstrate nuanced comprehension of literary and non-literary texts in a variety of genres.

Target: A minimum of 85% of students can perform at this level.

Implementation Plan (timeline): These skills are introduced at the 200 level and systematically assessed at the 300 and 400 levels.

Key/Responsible Personnel: Dr. Felicia B. Sturzer

Supporting Attachments: Sample exam FREN 332 Introduction to Literature 1800-present (Microsoft Word): Students must demonstrate their analytical skills in responding to synthetic, genre-based, and thematic questions on literary texts under study.

Findings for Textual Analysis - French

Summary of Findings: At the 300 level, in two separate literature courses, 72% of students demonstrated an 85% comprehension rate of the nuances in fictional and non-fictional works.

Target Achievement: Not Met

Recommendations : We will work to improve students' expression of their comprehension, since we believe their comprehension exceeded their ability to express it in the target language.

Notes : Language learning is inclusive of all skills that are assessed. Therefore, it is difficult to separate the skills in any one assessment.

Substantiating Evidence:

Textual Analysis - Spanish
Course level; Direct - Exam

Details/Description: Students in literature classes (Spanish 331, 332, 400-level) will read and analyze literature.

Target: 85% of students in upper-division literature and cultural classes will achieve 70% or better on exams relating to textual analysis.

Implementation Plan (timeline): Upper division classes (with the exception of 322 and 323) will deal with textual analysis in some regard.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Acting Department Head

Supporting Attachments: Spanish 499 Final (Microsoft Word)

Spanish 499 Mid Term (Microsoft Word)

Findings for Textual Analysis - Spanish

Summary of Findings: More than 85% of students received grades of 80% or better on midterm and final exams.

Target Achievement: Exceeded

Recommendations :

Notes : Grades are maintained in gradebook.

Substantiating Evidence:

III.3 READING

Mapped to: National Standards in Foreign Language Education: Standard 3.1: Knowledge of other disciplines

Measures & Findings

300 and 400 level Culture, Civilization, and Literature Classes
Course level; Direct - Exam

Details/Description: Spanish students will be tested on materials they have read by midterm and final exams.

Target: 85% of Spanish majors and minors will score 70% or better on midterm and final exams.

Implementation Plan (timeline): 300 and 400 level literature and civilization and culture classes.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Acting Department Head

Supporting Attachments:
Chapter of Culture and Civilization Course and Questions (Adobe Acrobat Document)
Latin American Culture and Civilization Mid Term (Microsoft Word)
Spanish 499 Final Exam (Microsoft Word)

Findings for 300 and 400 level Culture, Civilization, and Literature Classes

Summary of Findings: More than 85% of students achieved grades of 80% or better on midterm and final exams.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Technical/Professional Textual Analysis - French
Course level; Direct - Student Artifact

Details/Description: Students must demonstrate comprehension of technical and complex texts in a variety of genres from a variety of sources.

Target: A minimum of 85% of students are able to perform this task.

Implementation Plan (timeline): This task is introduced and reinforced in 300 level courses and mastered in 400 level courses.

Key/Responsible Personnel: Dr. Felicia B. Sturzer

Supporting Attachments: Paris, ville antique (Web Link): Sample link which forms the "textbook" for FREN 323 Introduction to French Culture and Civilization. http://www.paris.culture.fr/

Findings for Technical/Professional Textual Analysis - French

Summary of Findings: This skill was assessed in two separate 300 level classes. In the literature class, 72% understood complex literary analysis at an 85% comprehension rate. In the composition and conversation class, 100% of students understood complex articles at an 85% level. In aggregate, we met our 85% comprehension goal.

Target Achievement: Met

Recommendations : Sufficient data exists to assess this skill.

Notes :

Substantiating Evidence:

IV.1 WRITING

Mapped to: National Standards in Foreign Language Education: Standard 5.1: Within & Beyond Classroom

Measures & Findings

Exams Spanish 101, 102, 211, 212, 311, 312
Course level; Direct - Exam

Details/Description: Every exam of Spanish 101-212 contains an open-ended writing assessment. Spanish 311 and 312 are composition and conversation courses.

Target: 85% of students at all levels of 101, 102, 211, 212, 311, and 312 will receive 70% or better on the writing portion of their exams.

Implementation Plan (timeline): Spanish 101, 102, 211, 212, 311, 312

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Acting Department Head

Supporting Attachments: Exam 1 Spanish 101 (Microsoft Word)

Exam 1 Spanish 211 (Microsoft Word)

Spanish 312 In-Class Essay (Microsoft Word)

Findings for Exams Spanish 101, 102, 211, 212, 311, 312

Summary of Findings: This item is not tracked separately in most cases. However, compositions of Spanish 311 and 312 students are maintained in the department office, which indicate that more than 85% of students achieve scores of 80% or better on written compositions.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence: Writing Outcome 312, 499 (Microsoft Word)

Writing Samples for advanced courses - French
Course level; Direct - Student Artifact

Details/Description: These samples demonstrate writing outcomes required for advanced level language courses (300-400 level). Students are allowed to correct and revise writing samples after input from peers and instructors.

Target: At the 300 level, students should be able to meet the writing outcomes as indicated in this document at a level understood by a sympathetic native speaker. At the 400 level, they should be able to meet the writing outcomes at an advanced level as understood by any native speaker.

Implementation Plan (timeline): Samples are gathered throughout the year for all courses that are taught during that academic year.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Professor of French, Acting Dept. Head

Supporting Attachments: French 312-Syllabus for Composition and Conversation- Essay writing (Microsoft Word)

Findings for Writing Samples for advanced courses - French

Summary of Findings: In the 300 level composition and conversation class, 100% of students met the goal of 85% success in conveying personal information.

Target Achievement: Met

Recommendations : The above statistics represent essays that students have had an opportunity to correct. From a pedagogical standpoint, this reflects standard editing practices of any writing situation.

While writing skills are introduced in the second semester of the elementary sequence, they are not practiced in the 200 level courses. However, the latter stress conversation rather than writing skills.

Notes : Samples are on file in the Foreign Language office in Brock 208.

Substantiating Evidence:

IV.2 WRITING

Mapped to: National Standards in Foreign Language Education: Standard 3.2: Distinctive Viewpoints

Measures & Findings

Research Papers in Literature Classes beyond 332
Course level; Direct - Student Artifact

Details/Description: Students in 300 and 400-level classes beyond 332 are routinely asked to complete research papers in Spanish according to the MLA style.

Target: 85% of students will receive 70% or better.

Implementation Plan (timeline): Students in courses above Spanish 332.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Acting Dept. Head

Supporting Attachments: Sample Syllabus (Microsoft Word)

Findings for Research Papers in Literature Classes beyond 332

Summary of Findings: More than 85% of students achieve a grade of 80% or better on final research papers.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:
499 Syllabus Summer 2008.doc (Microsoft Word)
Writing Outcomes for 499 and 312.doc (Microsoft Word)

Writing Samples for advanced courses-French
Course level; Direct - Student Artifact

Details/Description: These samples demonstrate writing outcomes required for advanced level language courses (300-400 level). Students are allowed to correct and revise writing samples after input from peers and instructors.

Target: At the 300 level, students should be able to meet the writing outcomes as indicated in this document at a level understood by a sympathetic native speaker. At the 400 level, they should be able to meet the writing outcomes at an advanced level as understood by any native speaker.

Implementation Plan (timeline): Samples are gathered throughout the year for all courses that are taught during that academic year.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Professor of French, Acting Dept. Head

Supporting Attachments:
Syllabus FREN 332 (Microsoft Word): The portion on "les comptes" signals students to research, read, summarize, and integrate materials from articles from reputable journals in French on the topic of their choice for their paper. Students attend a short seminar with a research librarian to learn how to use the MLA database and retrieve articles from those citations before beginning their research.
syllabus FREN 413 (Microsoft Word): The syllabus outlines that students must write 3 shorter (1000 word) research papers in French on the books read and discussed in class. Each paper must make use of at least 2 reputable journal articles.

Findings for Writing Samples for advanced courses-French

Summary of Findings: In a 300 level literature class, 71% of students achieved a 75% ability to compose a research paper integrating primary and secondary sources. In a 400 level literature class, 100% demonstrated this ability.

Target Achievement: Met

Recommendations : It is evident that while students have difficulty with this skill initially, ultimately they master it.

Notes : Student samples are on file in the Foreign Language office in Brock Hall 208.

Substantiating Evidence:

IV.3 WRITING

Mapped to: National Standards in Foreign Language Education: Standard 2.1: Practices & Perspectives

Measures & Findings

300- and 400-level Spanish Classes (except 321, 322)
Course level; Direct - Student Artifact

Details/Description: Students write papers or answer questions on exams that relate directly to literary, cultural, personal, and socio-political topics.

Target: 85% of students will score 70% or better on papers and exams.

Implementation Plan (timeline): Students in most upper division classes (except for 321, 322).

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Acting Department Head

Supporting Attachments: Latin American Culture Exam (Microsoft Word)

Spanish 312 Final Exam (Microsoft Word)

Spanish 499 Exam (Microsoft Word)

Findings for 300- and 400-level Spanish Classes (except 321, 322)

Summary of Findings: Over 85% of students received grades of 80% or better on written papers and exams.

Target Achievement: Exceeded

Recommendations :

Notes : Exam grades are maintained in instructors' gradebooks.

Substantiating Evidence: Writing Outcomes for 499 and 312.doc (Microsoft Word)

Writing Samples for advanced courses-French
Course level; Direct - Student Artifact

Details/Description: These samples demonstrate writing outcomes required for advanced level language courses (300-400 level). Students are allowed to correct and revise writing samples after input from peers and instructors.

Target: At the 300 level, students should be able to meet the writing outcomes as indicated in this document at a level understood by a sympathetic native speaker. At the 400 level, they should be able to meet the writing outcomes at an advanced level as understood by any native speaker.

Implementation Plan (timeline): Samples are gathered throughout the year for all courses that are taught during that academic year.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Professor of French, Acting Dept. Head

Findings for Writing Samples for advanced courses-French

Summary of Findings: At the 300 level, all of our exams incorporate this assessment skill. 85% of students in two different classes demonstrated a 75% success rate at responding to complex questions.

The literature classes focus on assessing literary/cultural expression, while composition/conversation classes focus on assessing personal and socio-political topics.

Target Achievement: Met

Recommendations :

Notes : Student samples are on file in the Foreign Language office in Brock Hall 208.

Substantiating Evidence:
French 311 Examen (Microsoft Word): This test is for the French 311 Composition and Conversation course, which includes oral and written sections. The topics are on comparative analysis between French and American culture.
French 331 Examen (Microsoft Word): This is a test for French 331, the Intro. to French lit. course (Medieval period - 18th century). It includes essays and identification questions.

Supporting Attachments:
French 312-Final Exam (Composition) (Microsoft Word)
Sample exam FREN 332 Introduction to Literature 1800-present (Microsoft Word): As in all exams in this course, students must link textual examples to themes and genre practices, contextualizing them within the movement and period that they represent.

IV.4 WRITING

Mapped to: National Standards in Foreign Language Education: Standard 1.2: Written & Spoken

Measures & Findings

Spanish Upper Division Classes
Course level; Direct - Student Artifact

Details/Description: Students narrate a story.

Target: 85% of students can relate or summarize a real or fictional story.

Implementation Plan (timeline): Most upper division classes.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Acting Head

Supporting Attachments: Spanish 312 Final (Microsoft Word)

No Findings Added to Spanish Upper Division Classes

Writing Samples for advanced courses-French
Course level; Direct - Student Artifact

Details/Description: These samples demonstrate writing outcomes required for advanced level language courses (300-400 level). Students are allowed to correct and revise writing samples after input from peers and instructors.

Target: At the 300 level, students should be able to meet the writing outcomes as indicated in this document at a level understood by a sympathetic native speaker. At the 400 level, they should be able to meet the writing outcomes at an advanced level as understood by any native speaker.

Implementation Plan (timeline): Samples are gathered throughout the year for all courses that are taught during that academic year.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Professor of French, Acting Dept. Head

Supporting Attachments: La premiere France (Microsoft Word): On this exam, students must follow the course of French culture and civilization from pre-historic France to the Carolingian age, following a particular theme of their choosing.

Findings for Writing Samples for advanced courses-French

Summary of Findings: This assessment measure is introduced in the second semester of our 200 level course and reinforced in the 300 level composition/conversation course. By the 400 level course, we take this skill for granted.

In the 300 level composition/conversation course, 100% of the students performed at an 85% success rate.

Target Achievement: Met

Recommendations :

Notes : Student samples are available in the Foreign Language Office in Brock 208.

Substantiating Evidence:


Sample exam FREN 211 Intermediate French for Conversation (Microsoft Word)

In this exam students must recount a story in dialogue form (asking questions to draw out the story and then responding with pieces of the events and their circumstances).

V.1 CULTURE

Mapped to: National Standards in Foreign Language Education: Standard 4.1: Nature of Language

Measures & Findings

I Can survey
Other level; Indirect - Survey

Details/Description: Students were asked what they can do in the target language across the first and second year courses.

Target: Students will be 75% proficient at the stated goal.

Implementation Plan (timeline): Throughout the 2008-2009 year at the first and second year level, as administered by students of the Psychology Dept. and supervised by a professor in the Psychology Dept.

Key/Responsible Personnel: Felicia Sturzer, Ph.D., Acting Dept. Head

Supporting Attachments: I Can survey (Microsoft Word): Survey administered across the first and second year language sequence.

Findings for I Can survey

Summary of Findings: Overall, students self-assessed at around 20-30% for this type of knowledge, depending upon phrasing.

Target Achievement: Met

Recommendations : Strong start for this level, as it is not mastered until a higher level.

Notes :

Substantiating Evidence:I Can survey (Microsoft Word)Results of student survey 2008-2009.

Interplay between Language and Culture-FrenchCourse level; Direct - Student Artifact

Details/Description: Students provide written and oralanalysis in a variety of forms of the interplay betweenthe target language and culture.

Target: A minimum of 85% of students should be ableto perform this task.

Implementation Plan (timeline): This task is introducedat the elementary level, reinforced at the 200 and 300levels, and mastered at the 400 level.

Key/Responsible Personnel: Dr. Felicia B. Sturzer

Supporting Attachments:Sample chapter quiz FREN 102 (Microsoft Word)Like most quizzes and exams at the introductory level,students must demonstrate their knowledge of grammarand vocabulary and tie it (here food and eating) to itsFrench cultural context, explaining the meaning of "lesplaisirs de la table."

Sample test French 312 Comp. andConversation (Adobe Acrobat Document)

This sample tests assesses student's understanding of the

Findings for Interplay between Language andCulture-French

Summary of Findings: At the 200 level, this outcomewas not disaggregated from class work and exam scores.

At the 300 level, the Composition and Conversationcourse (311-312) focuses on the relationship betweenlanguage and culture. We met our goal since at least85% of students passed the course with a B or better.

Target Achievement: Exceeded

Recommendations : In the future, we will disaggregatethis element from the other skills in the second year.

Notes :

Substantiating Evidence:


Spanish 211, 212, 323, 325 Culture Component
Course level; Direct - Exam

Details/Description: Spanish and Latin American Culture and Geography are tested on exams.

Target: 85% of students will score 70% or better.

Implementation Plan (timeline): Intermediate Spanish and Spanish 323 and 325.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Acting Department Head

Supporting Attachments: Spanish 211 Sample Exam (Microsoft Word)

Spanish 325 Exam (Microsoft Word)

Findings for Spanish 211, 212, 323, 325 Culture Component

Summary of Findings: In the category of cultural competency, students exceeded expectations, with 100% and 92% of students passing final exams for the culture courses of Spanish 323 or 325 with a grade of 70% or better.

Target Achievement: Exceeded

Recommendations :

Notes : Grades are maintained in instructors' gradebooks.

Substantiating Evidence:

V.2 CULTURE

Mapped to: National Standards in Foreign Language Education: Standard 4.2: Concept of Culture

Measures & Findings

Comprehension of Cultural Determinants - French
Course level; Direct - Student Artifact

Details/Description: Students analyze cultural markers in a variety of forms, recognizing historical and socio-political contingencies.

Target: A minimum of 85% of students are able to perform at this level.

Implementation Plan (timeline): This measure is implemented at the elementary level and continues in the 200, 300, and 400 level courses.

Key/Responsible Personnel: Dr. Felicia B. Sturzer

Supporting Attachments: Exam 2 from Introduction to France (Microsoft Word). This is the second exam of the Introduction to France course, FREN 323, covering from the Norman invasions to the Renaissance.

Findings for Comprehension of Cultural Determinants - French

Summary of Findings: We did not assess this skill in 2008-09 because the culture course was not taught.

Target Achievement:

Recommendations :

Notes :

Substantiating Evidence:

I Can survey
Other level; Indirect - Survey

Details/Description: Students were asked what they can do in the target language across the first and second year courses.

Target: Students will be 75% proficient at the stated goal.

Implementation Plan (timeline): Throughout the 2008-2009 year at the first and second year level, as administered by students of the Psychology Dept. and supervised by a professor in the Psychology Dept.

Key/Responsible Personnel: Felicia Sturzer, Ph.D., Acting Dept. Head

Findings for I Can survey

Summary of Findings: At the first and second year, students felt about 35% proficient at this skill.

Target Achievement: Met

Recommendations : Strong start for this level, as it is not mastered until a higher level.

Notes :

Substantiating Evidence:


Supporting Attachments: I Can survey (Microsoft Word). Survey administered across the first and second year language sequence.

I Can survey (Microsoft Word). Results of student survey 2008-2009.

Intermediate Spanish and Culture and Civilization Courses
Course level; Direct - Exam

Details/Description: Students will understand and express the markers that distinguish a foreign culture.

Target: 85% of students will achieve scores of 70% or better on exams.

Implementation Plan (timeline): Intermediate and advanced Spanish courses.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Acting Department Head

Supporting Attachments: Sample 211 Exam (Microsoft Word)

Sample Latin American Culture and Civilization Exam (Microsoft Word)

Findings for Intermediate Spanish and Culture and Civilization Courses

Summary of Findings: Culture grades for 211 and 212 are not tracked separately. In Spanish 323 and 325, students exceeded expectations, with 100% and 92% of students passing final exams with 70% or better.

Target Achievement: Exceeded

Recommendations : Raise expectations from 85% of students scoring 70% or better to 85% of students scoring 80% or better.

Notes :

Substantiating Evidence:

V.3 CULTURE

Mapped to: National Standards in Foreign Language Education: Standard 2.2: Products & Perspectives

Measures & Findings

Auralog pre and post-test
Course level; Direct - Exam

Details/Description: TellMeMore is a software program with built-in voice recognition that measures oral and aural comprehension as well as grammar, culture, and reading skills. This is produced by the company Auralog.

Target: Incoming first-year students are expected to score between 1.0 and 1.6 on the 10-point scale. At the end of the first year they should score between 2.0 and 2.6. Incoming second-year students are expected to perform between 2.0 and 2.6. At the end of the second year they should score between 3.0 and 3.6.

Implementation Plan (timeline): This was administered at the beginning of the first- and second-year courses and at the end of the first- and second-year sequences.

Key/Responsible Personnel: Dr. Felicia B. Sturzer, Acting Dept. Head

Supporting Attachments: Auralog scores correlated to EU standards (Microsoft Word)

Findings for Auralog pre and post-test

Summary of Findings: The findings at this introductory level are as expected. Students with the lowest incoming scores spent less time working in the software but, due to classroom instruction, still made the greatest gains. Those with the highest incoming scores yielded the lowest gains because language acquisition slows as the material gets more difficult. The two groups in the middle also performed as expected, in that the group that consistently worked harder, as measured by time in the software, made greater strides than the group that worked less. Overall, increases in gain diminished as scores increased, reflecting the increasing difficulty of the material.

As an absolute measure of progress, students met our expectations.

At the second-year level, lower-performing students spent substantially more time in the software and made more substantial improvement. It must be noted that those scoring a 1 simply did not engage with the instrument, so improvements at the lowest level are suspect, and those with 1s should have been discounted in the measures.

Target Achievement: Met


Recommendations : Since there is no remarkable difference in gain between groups that worked more, as measured in time, and those who worked less, we conclude that the software provides little added value when measured against time on task. At the intermediate level, supplemental work did seem to yield better results.

Notes :

Substantiating Evidence: Pre and Post test in Auralog for FREN 211-212 (Word Document (Open XML)); Pre and Post test results in Auralog FREN 101-102 (Word Document (Open XML))

I Can survey
Other level; Indirect - Survey

Details/Description: Students were asked what they can do in the target language across the first and second year courses.

Target: Students will be 75% proficient at the stated goal.

Implementation Plan (timeline): Throughout the 2008-2009 year at the first and second year level, as administered by students of the Psychology Dept. and supervised by a professor in the Psychology Dept.

Key/Responsible Personnel: Felicia Sturzer, Ph.D., Acting Dept. Head

Supporting Attachments: I Can survey (Microsoft Word). Survey administered across the first and second year language sequence.

Findings for I Can survey

Summary of Findings: Results ranged from proficient to not proficient, depending on whether students needed to identify key figures or identify key moments of the life cycle in the target culture (34%).

Target Achievement: Met

Recommendations : Strong start for this level, as it is not mastered until a higher level.

Notes :

Substantiating Evidence: I Can survey (Microsoft Word). Results of student survey 2008-2009.

Identifying and Contextualizing Manifestations of Culture - French
Course level; Direct - Student Artifact

Details/Description: Students must identify, analyze, and compare various manifestations of culture in different formats and contexts.

Target: A minimum of 85% of students are able to perform this task.

Implementation Plan (timeline): This is implemented beginning at the elementary level and continues in the 200, 300, and 400 level courses.

Key/Responsible Personnel: Dr. Felicia B. Sturzer

Supporting Attachments: Exam from Introduction to Culture (Microsoft Word). For the early period of French culture, we examine a variety of French websites, study their content, and evaluate the value and the values promoted in the way the French understand and retell the story of their culture.

Findings for Identifying and Contextualizing Manifestations of Culture - French

Summary of Findings: We could not assess this skill because we did not teach the culture course in this cycle.

Target Achievement:

Recommendations :

Notes :

Substantiating Evidence:


Spanish and Latin American Culture and Civilization Courses
Course level; Direct - Exam

Details/Description: All Spanish majors are required to take either a Spanish or Spanish American Culture class. Spanish minors may also take the courses towards graduation requirements.

Target: 85% of students will score 70% or better on exams.

Implementation Plan (timeline): The 300 and 400 level curriculum.

Key/Responsible Personnel: Felicia B. Sturzer, Ph.D., Acting Head

Supporting Attachments: Latin American Civilization and Culture (Microsoft Word); Spanish 499 Alberti, Leon Class (Microsoft Word)

Findings for Spanish and Latin American Culture and Civilization Courses

Summary of Findings: In the category of cultural competency, students exceeded expectations, with 100% and 92% of students passing final exams for the culture courses of Spanish 323 or 325 with a grade of 70% or better.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

V.4 CULTURE

Mapped to: National Standards in Foreign Language Education: Standard 4.2: Concept of Culture

Measures & Findings

Cultural Relativity - Spanish and Spanish American Culture Courses
Course level; Direct - Exam

Details/Description: Students analyze and compare cultural manifestations in the target culture and relate them to a broader cultural context.

Target: All Spanish majors are required to take either Spanish 323 or 325, Spanish and Spanish American Culture. In addition, many minors may take this course as part of their 18-hour requirement.

Implementation Plan (timeline): Junior or Senior year.

Key/Responsible Personnel: Dr. Felicia Sturzer, Dept. Head

Supporting Attachments: Art on Final Exam Spanish 325 (Microsoft PowerPoint)

Final Exam Spanish 325 Spring 2009 (Microsoft Word)

Midterm Exam Spanish 323 (Microsoft Word)

Findings for Cultural Relativity - Spanish and Spanish American Culture Courses

Summary of Findings: In the category of cultural competency, students exceeded expectations, with 100% and 92% of students passing final exams for the culture courses of Spanish 323 or 325 with a grade of 70% or better.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Cultural Relativity - French
Course level; Direct - Other

Details/Description: Students analyze and compare cultural manifestations in the target culture and relate them to a broader cultural context.

Target: A minimum of 85% of students are able to perform this task.

Implementation Plan (timeline): This is implemented beginning at the elementary level, reinforced at the 200 and 300 levels, and mastered at the 400 level.

Findings for Cultural Relativity - French

Summary of Findings: This skill was not assessed because the culture course was not taught during this cycle.

Target Achievement:

Recommendations :

Notes :


Key/Responsible Personnel: Dr. Felicia B. Sturzer

Supporting Attachments: Final Exam for Introduction to French Culture (Microsoft Word). The final exam asks students to move back and forth from cultural artifacts to the culture that produced them and to follow the development of a single manifestation of culture across history, thereby articulating aspects that do and do not change across time, politics, and history.

Substantiating Evidence:

I Can survey
Other level; Indirect - Survey

Details/Description: Students were asked what they can do in the target language across the first and second year courses.

Target: Students will be 75% proficient at the stated goal.

Implementation Plan (timeline): Throughout the 2008-2009 year at the first and second year level, as administered by students of the Psychology Dept. and supervised by a professor in the Psychology Dept.

Key/Responsible Personnel: Felicia Sturzer, Ph.D., Acting Dept. Head

Supporting Attachments: I Can survey (Microsoft Word). Survey administered across the first and second year language sequence.

Findings for I Can survey

Summary of Findings: Depending on how this skill is phrased, student confidence in their proficiency varied. Personal comparison ranked consistently between 70 and 80%, as did using cultural knowledge to comprehend a story, which ended near 80%. But identifying and explaining a cultural site earned only 40%, and choosing a culturally appropriate gift earned 36%.

Target Achievement: Met

Recommendations : Strong start for this level, as it is not mastered until a higher level.

Notes :

Substantiating Evidence: I Can survey (Microsoft Word). Survey results for 2008-2009.

Foreign Languages: BA Greek

Outcomes

1. Grammar

Mapped to: Foreign Languages: BA Greek: Culture, Grammar, Grammar, Grammar, Grammar, Grammar, Grammar, Grammar, Translation, Translation, Translation, Translation

Measures & Findings

Exams and quizzes
Course level; Direct - Exam

Details/Description: Students in Greek 101 are introduced to the three declensions of nouns and adjectives through oral instruction, written exercises, and translations involving Greek-English and English-Greek.

Target: 75% of students will be able to recognize, decline, and parse nouns and adjectives of the three declensions.

Implementation Plan (timeline): Implementation of exams and quizzes over the course of the semester.

Key/Responsible Personnel: John Phillips

Findings for Exams and quizzes

Summary of Findings: 58% of students in Greek 101 successfully completed this outcome.

Target Achievement: Not Met

Recommendations : Continue to emphasize this area of grammar.

Notes :

Substantiating Evidence:


Supporting Attachments:

Grammar

Mapped to: Foreign Languages: BA Greek: 1. Grammar

Measures & Findings

Exams and quizzes
Course level; Direct - Exam

Details/Description: Students in Greek 101 and 102 are introduced to the present, imperfect, and aorist tenses of ω-type verbs and participles through oral instruction, written exercises, and translations involving Greek-English and English-Greek.

Target: 75% of students will be able to recognize, conjugate, decline, and parse correctly the present tense forms of ω-type verbs and of participles.

Implementation Plan (timeline): Implementation of exams and quizzes throughout the semester.

Key/Responsible Personnel: John Phillips

Supporting Attachments:

Findings for Exams and quizzes

Summary of Findings: 80% of students in Greek 102 successfully completed this outcome.

Target Achievement: Exceeded

Recommendations : Continue to emphasize this area of grammar.

Notes :

Substantiating Evidence:

Grammar

Mapped to:No Mapping

Measures & Findings

Exams and quizzes
Course level; Direct - Exam

Details/Description: Students in Greek 102 are introduced to the declension of the relative pronouns and the syntax of relative clauses through oral instruction, written exercises, and translations involving Greek-English and English-Greek.

Target: 75% of students will be able to recognize, decline, and parse relative pronouns, and recognize the syntax of relative clauses.

Implementation Plan (timeline): Implementation of exams and quizzes over the course of the semester.

Key/Responsible Personnel: John Phillips

Supporting Attachments:

Findings for Exams and quizzes

Summary of Findings: 80% of students in Greek 102 successfully completed this outcome.

Target Achievement: Exceeded

Recommendations : Continue to emphasize this area of grammar.

Notes :

Substantiating Evidence:

Grammar

Mapped to:No Mapping

Measures & Findings

Exams and quizzes
Course level; Direct - Exam

Details/Description: Students in Greek 101 and 102 are introduced to the conjugation of imperatives of the present and aorist tenses through oral instruction, written exercises, and translations involving Greek-English and English-Greek.

Target: 75% of students will be able to recognize and parse imperatives of both tenses.

Findings for Exams and quizzes

Summary of Findings: 80% of students in Greek 102 successfully completed this outcome.

Target Achievement: Exceeded

Recommendations : Continue to emphasize this area of grammar.


Implementation Plan (timeline): Implementation of exams and quizzes over the course of the semester.

Key/Responsible Personnel: John Phillips

Supporting Attachments:

Notes :

Substantiating Evidence:

Grammar

Mapped to:No Mapping

Measures & Findings

Exams and quizzes
Course level; Direct - Exam

Details/Description: Students in Greek 350 are introduced to the subjunctive and optative moods through oral instruction, written exercises, and translations involving Greek-English and English-Greek.

Target: 75% of students will be able to recognize, conjugate, and parse verbs of both moods.

Implementation Plan (timeline): Implementation of exams and quizzes over the course of the semester.

Key/Responsible Personnel: John Phillips

Supporting Attachments:

Findings for Exams and quizzes

Summary of Findings: 100% of students in Greek 350 (intermediate level) successfully completed this outcome.

Target Achievement: Exceeded

Recommendations : Continue to emphasize this area of grammar.

Notes :

Substantiating Evidence:

Grammar

Mapped to:No Mapping

Measures & Findings

Exams and quizzes
Course level; Direct - Exam

Details/Description: Students in Greek 350 are introduced to the uses of the subjunctive and optative moods in both independent and dependent clauses through oral instruction, written exercises, and translations involving Greek-English and English-Greek.

Target: 75% of students will be able to recognize and properly translate the various uses of these moods.

Implementation Plan (timeline): Implementation of exams and quizzes over the course of the semester.

Key/Responsible Personnel: John Phillips

Supporting Attachments:

Findings for Exams and quizzes

Summary of Findings: 100% of students in Greek 350 (intermediate level) successfully completed this outcome.

Target Achievement: Exceeded

Recommendations : Continue to emphasize this area of grammar.

Notes :

Substantiating Evidence:

Grammar

Mapped to:No Mapping

Measures & Findings

Exams and quizzes
Course level; Direct - Exam

Details/Description: Students in Greek 102 are introduced to the comparative forms of adjectives and adverbs through oral instruction, written exercises, and translations involving Greek-English and English-Greek.

Findings for Exams and quizzes

Summary of Findings: 80% of students in Greek 102 successfully completed this outcome.

Target Achievement: Exceeded


Target: 75% of students will be able to recognize, decline, and parse the comparative and superlative forms of adjectives and adverbs.

Implementation Plan (timeline): Implementation of exams and quizzes over the course of the semester.

Key/Responsible Personnel: John Phillips

Supporting Attachments:

Recommendations : Continue to emphasize this area of grammar.

Notes :

Substantiating Evidence:

Grammar

Mapped to:No Mapping

Measures & Findings

Exams and quizzes
Course level; Direct - Exam

Details/Description: Students in Greek 350 are introduced to the perfect, future perfect, and pluperfect tenses of verbs of both types through oral instruction, written exercises, and translations involving Greek-English and English-Greek.

Target: 75% of students will be able to recognize, conjugate, and parse correctly these three tenses of verbs.

Implementation Plan (timeline): Implementation of exams and quizzes throughout the semester.

Key/Responsible Personnel: John Phillips

Supporting Attachments:

Findings for Exams and quizzes

Summary of Findings: 100% of students in Greek 350 (intermediate level) successfully completed this outcome.

Target Achievement: Exceeded

Recommendations : Continue to emphasize this area of grammar.

Notes :

Substantiating Evidence:

Translation

Mapped to:No Mapping

Measures & Findings

Exams
Course level; Direct - Exam

Details/Description: Students in Greek 350 and 351 are assigned daily translation exercises from the texts of primary authors of prose and poetry.

Target: 75% of students will be able to translate these passages with 90% accuracy in their daily assignments and translation exams.

Implementation Plan (timeline): Implementation of daily translation assignments and translation exams over the course of the semester.

Key/Responsible Personnel: John Phillips

Supporting Attachments:

Findings for Exams

Summary of Findings: 100% of students in Greek 350 (intermediate level) successfully completed this outcome.

Target Achievement: Exceeded

Recommendations : Continue to emphasize this area of translation skills.

Notes :

Substantiating Evidence:

Translation

Mapped to:No Mapping

Measures & Findings

Exams and translations
Course level; Direct - Exam

Findings for Exams and translations


Details/Description: Students in Greek 350 (advanced level) are assigned daily translation exercises from the texts of primary authors of prose and poetry. Translation exercises and discussion of these exercises focus on the prose styles of the different authors.

Target: 75% of students will be able to appreciate different prose styles in discussions of their daily assignments and translation exams.

Implementation Plan (timeline): Implementation of daily translation assignments and translation exams over the course of the semester. Discussion of these assignments and exams.

Key/Responsible Personnel: John Phillips

Supporting Attachments:

Summary of Findings: 100% of students in Greek 350 (advanced level) successfully completed this outcome.

Target Achievement: Exceeded

Recommendations : Continue to emphasize this area of translation skills.

Notes :

Substantiating Evidence:

Translation

Mapped to:No Mapping

Measures & Findings

Exams and translations
Course level; Direct - Exam

Details/Description: Students in Greek 351 are assigned daily translation exercises from the texts of primary authors of poetry. Translation exercises and discussion of these exercises focus on the genres of poetry.

Target: 75% of students will be able to appreciate different poetic genres in discussions of their daily assignments and translation exams.

Implementation Plan (timeline): Implementation of daily translation assignments and translation exams over the course of the semester. Discussion of these assignments and exams.

Key/Responsible Personnel: John Phillips

Supporting Attachments:

Findings for Exams and translations

Summary of Findings: 100% of students in Greek 351 successfully completed this outcome.

Target Achievement: Exceeded

Recommendations : Continue to emphasize this area of translation skills.

Notes :

Substantiating Evidence:

Translation

Mapped to:No Mapping

Measures & Findings

Exams and translations
Course level; Direct - Exam

Details/Description: Students in Greek 350 (advanced level) and 351 are assigned daily translation exercises from the texts of primary authors of prose and poetry. Translation exercises and discussion of these exercises focus on accuracy of translation and identification of syntactical structures.

Target: 75% of students will be able to translate accurately and identify important syntactical structures 90% of the time in their daily assignments and translation exams.

Findings for Exams and translations

Summary of Findings: 100% of students in Greek 351 successfully completed this outcome.

Target Achievement: Exceeded

Recommendations : Continue to emphasize this area of translation skills.

Notes :

Substantiating Evidence:


Implementation Plan (timeline): Implementation of daily translation assignments and translation exams over the course of the semester.

Key/Responsible Personnel: John Phillips

Supporting Attachments:

Culture

Mapped to:No Mapping

Measures & Findings

Exams and translations
Course level; Direct - Exam

Details/Description: Students in Greek 350 (advanced level) and 351 are assigned daily translation exercises from the texts of primary authors of prose and poetry. Translation exercises and discussion of these exercises focus on the culture and history embodied in the texts.

Target: 75% of students will be able to appreciate and discuss intelligently the culture and history of ancient Greece as reflected in their daily assignments and translation exams.

Implementation Plan (timeline): Implementation of daily translation assignments and translation exams over the course of the semester. Discussion of these assignments and exams.

Key/Responsible Personnel: John Phillips

Supporting Attachments:

Findings for Exams and translations

Summary of Findings: 100% of students in Greek 351 successfully completed this outcome.

Target Achievement: Exceeded

Recommendations : Continue to emphasize this area of cultural appreciation.

Notes :

Substantiating Evidence:

Foreign Languages: BA Latin

Outcomes

I.1 Grammar

Mapped to:No Mapping

Measures & Findings

Measure: In-house exams

Details/Description: It is expected that 75% of students be able by the end of LAT 101 to identify all parts of the active indicative verbal system with an accuracy of at least 75%. The same proportion of students should be able to produce these forms from memory with an accuracy rate of at least 65%. At the end of LAT 102, 75% of students should be able, with an accuracy of 75%, to identify all forms of the entire indicative verb system (including passives and participles). Again, an accuracy rate of 65% will be expected for active recall of these same forms.

Target: As stated in the Details section, 75% of students are expected to meet the standard appropriate to their semester level.

No Findings Added to Measure: In-house exams


Implementation Plan (timeline): The skills are to be introduced in LAT 101 and to be fully in place by the end of LAT 102. An assessment will be made in each of these two courses near the end of the semester.

Key/Responsible Personnel: Dr. Joshua Davies

Notes: Due to doubly unfortunate circumstances (a crashed hard drive and boxes lost in a move), findings are not available for 2008-2009. Care has been taken for more secure preservation of data in future years.

Supporting Attachments:

I.2 Grammar

Mapped to:No Mapping

Measures & Findings

In-house exams

Details/Description: At the close of LAT 201, 75% of students should be able to identify and translate with 75% accuracy all forms of the subjunctive verbal system, the syntactic constructions that are tied to the subjunctive (e.g. purpose clauses, conditionals, wishes, etc.), and the ablative absolute. The same proportion of students should be able to generate these forms and simple examples of these structures with at least 65% accuracy. After completing LAT 202, 80% of students should be able to perform all the tasks above at the same levels of accuracy.

Target: 75% of students are expected to meet the standard for LAT 201, 80% for LAT 202.

Implementation Plan (timeline): The skills are to be introduced in LAT 201 and to be fully in place by the end of LAT 202. An assessment will be made in each of these two courses near the end of the semester.

Key/Responsible Personnel: Dr. Joshua Davies

Notes: Due to doubly unfortunate circumstances (a crashed hard drive and boxes lost in a move), findings are not available for 2008-2009. Care has been taken for more secure preservation of data in future years.

Supporting Attachments:

No Findings Added to In-house exams

I.3 Grammar

Mapped to:No Mapping

Measures & Findings

In-house exams

Details/Description: It is expected that 75% of students be able by the end of LAT 101 to identify all forms of the nouns in declensions 1-3 with an accuracy of at least 75%; and they should be able to produce these same forms from memory with an accuracy rate of at least 65%. At the end of LAT 102, 75% of students should be able, with an accuracy of 75%, to identify all forms of all noun/adjective declensions and also the major pronouns. An accuracy rate of 65% will be expected for active recall of these same forms.

No Findings Added to In-house exams


Target: 75% of students will meet the standard for their semester level.

Implementation Plan (timeline): The skills are introduced in LAT 101 and mastered in LAT 102. An assessment will be made in each of these two courses near the end of the semester.

Key/Responsible Personnel: Dr. Joshua Davies

Notes: Due to doubly unfortunate circumstances (a crashed hard drive and boxes lost in a move), findings are not available for 2008-2009. Care has been taken for more secure preservation of data in future years.

Supporting Attachments:

I.4 Grammar

Mapped to:No Mapping

Measures & Findings

In-house exams

Details/Description: 75% of LAT 102 students will be expected to comprehend indirect discourse and to produce actively simple examples of this construction in writing. 75% of LAT 201 students will be expected to comprehend complex indirect discourse and produce moderately complex examples of it. This same outcome will be achieved by 80% of students in LAT 202.

Target: 75% of LAT 102 and LAT 201 students will meet the standard for their semester level. 80% of LAT 202 students will meet the standard for their level.

Implementation Plan (timeline): The skills are introduced in LAT 102, reinforced and developed in LAT 201, and fully mastered in LAT 202. An assessment will be made in each of these courses near the end of the semester.

Key/Responsible Personnel: Dr. Joshua Davies

Notes: Due to doubly unfortunate circumstances (a crashed hard drive and boxes lost in a move), findings are not available for 2008-2009. Care has been taken for more secure preservation of data in future years.

Supporting Attachments:

No Findings Added to In-house exams

II.1 Vocabulary

Mapped to:No Mapping

Measures & Findings

In-house exams

Details/Description: When confronted with a page of Latin at their semester level, 75% of students should be able to translate 70% of the words without assistance from a dictionary.

No Findings Added to In-house exams


Target: 75% of students will meet the standard.

Implementation Plan (timeline): The skills are introduced in LAT 101 and reinforced thereafter. Targeted assessment happens through LAT 202 near the end of each semester.

Key/Responsible Personnel: Dr. Joshua Davies

Notes: Due to doubly unfortunate circumstances (a crashed hard drive and boxes lost in a move), findings are not available for 2008-2009. Care has been taken for more secure preservation of data in future years.

Supporting Attachments:

II.2 Vocabulary

Mapped to:No Mapping

Measures & Findings

In-house activity

Details/Description: Given a page of English prose, 75% of students should be able to identify a certain percentage of the Latin derivatives: 20% after LAT 101, 40% after LAT 102, 50% after LAT 201, 60% after LAT 202.

Target: 75% of students will meet the standard.

Implementation Plan (timeline): The skills are introduced in LAT 101 and reinforced thereafter. Targeted assessment happens through LAT 202 near the end of each semester.

Key/Responsible Personnel: Dr. Joshua Davies

Notes: Due to doubly unfortunate circumstances (a crashed hard drive and boxes lost in a move), findings are not available for 2008-2009. Care has been taken for more secure preservation of data in future years.

Supporting Attachments:

No Findings Added to In-house activity

III.1 Writing

Mapped to:No Mapping

Measures & Findings

In-house exam

Details/Description: Students should be able to compose fairly quickly simple and properly formed sentences that integrate the full grammatical knowledge and vocabulary of their semester level. The instructor will assess them in each semester of basic Latin (LAT 101 & 102) and intermediate Latin (LAT 201 & 202) by distributing in class eight English sentences to be rendered into Latin in fifteen minutes. 75% of students will be expected to perform at a "C" level or higher.

Target: 75% of students will meet the standard of their semester level.

No Findings Added to In-house exam


Implementation Plan (timeline): The skills are introduced in LAT 101, reinforced in LAT 102 and 201, and mastered in LAT 202. Assessment will take place near the end of each of these courses.

Key/Responsible Personnel: Dr. Joshua Davies

Notes: Due to doubly unfortunate circumstances (a crashed hard drive and boxes lost in a move), findings are not available for 2008-2009. Care has been taken for more secure preservation of data in future years.

Supporting Attachments:

III.2 Writing

Mapped to:No Mapping

Measures & Findings

In-house activity

Details/Description: Students will be expected to compose creative compositions, in a Latin appropriate to their semester level, either in the form of a dramatic dialogue or else in one of the ancient literary genres. They will do this as a group project once each semester from LAT 101 through LAT 201. It is expected that 75% of them will perform this task in a manner that the instructor considers worthy of a letter grade of B or higher. In LAT 202, students will produce individual compositions in an ancient genre such as poetic epigram or tombstone dedication. 75% will be expected to earn at least a letter grade of C on this work.

Target: 75% of students will meet the standard of their semester level.

Implementation Plan (timeline): The skills are introduced in LAT 101 and reinforced through LAT 202. Assessment will take place near the end of each semester.

Key/Responsible Personnel: Dr. Joshua Davies

Notes: Due to doubly unfortunate circumstances (a crashed hard drive and boxes lost in a move), findings are not available for 2008-2009. Care has been taken for more secure preservation of data in future years.

Supporting Attachments:

No Findings Added to In-house activity

IV.1 Speaking and Aural Comprehension

Mapped to:No Mapping

Measures & Findings

In-house activity

Details/Description: In each semester, Latin students should be able to understand as the instructor speaks in level-appropriate Latin and asks questions about the narratives being studied or about various other topics. In addition, students should be able to formulate simple answers in Latin to the questions asked, and their ability to do so should show some growth through the sequence of LAT 101, LAT 102, LAT 201, and LAT 202. While this kind of activity will be frequent during the semester, assessment will take place just once near the end of the term. The instructor will discuss a subject in Latin and pose a question to each student, noting how many are able to answer in simple but well formulated Latin. It is expected that at least 70% will perform this task acceptably.

No Findings Added to In-house activity


Target: 70% of students will meet the standard of their semester level.

Implementation Plan (timeline): The skills are introduced in LAT 101 and reinforced through LAT 202. Assessment will take place near the end of each semester.

Key/Responsible Personnel: Dr. Joshua Davies

Notes: Due to doubly unfortunate circumstances (a crashed hard drive and boxes lost in a move), findings are not available for 2008-2009. Care has been taken for more secure preservation of data in future years.

Supporting Attachments:

V.1 Reading

Mapped to:No Mapping

Measures & Findings

In-house activity

Details/Description: The ability to read and understand Latin texts is the central linguistic competency of the major, the one to which all the others lead and contribute. Each semester the instructor will make an assessment of reading by asking the students to translate, with the aid of a dictionary, a passage that they have never seen before. It is expected that 75% of them will perform this task at a level that the instructor deems deserving of a letter grade of C or better. For LAT 101, LAT 102, and LAT 201, the passage will be simplified Latin taken from a textbook aimed at their level. For LAT 202, they will receive an unmodified but clear and straightforward passage from one of the Roman authors. In LAT 350 & 351, they will receive an unmodified but more difficult and complex passage from one of the Roman authors.

Target: 75% of students will meet the standard of their semester level.

Implementation Plan (timeline): The skills are introduced in LAT 101 and reinforced thereafter, reaching a relative mastery at the 300 level. Targeted assessment happens in every course near the end of each semester.

Key/Responsible Personnel: Dr. Joshua Davies

No Findings Added to In-house activity


Notes: Due to doubly unfortunate circumstances (a crashed hard drive and boxes lost in a move), findings are not available for 2008-2009. Care has been taken for more secure preservation of data in future years.

Supporting Attachments:

VI.1 Culture

Mapped to:No Mapping

Measures & Findings

Essay and class discussion

Details/Description: In 300-level Latin courses, the emphasis will be on detailed knowledge of important authors, in terms of their literary style and modes of expression, the genres in which they write, their literary or intellectual contributions, and their influence on later civilization. Two means of assessment will be utilized. First, there will be a cumulative group discussion of the author near the end of the semester, in which the instructor notes the number of students who bring forth informed and meaningful contributions. It is expected that 75% will do so. Secondly, the students will each write a term paper on one of these areas (i.e. the author's literary qualities, ideas, influence, etc.), and it will be expected that at least 75% reveal an understanding of their chosen subject that the instructor considers to be substantial.

Target: 75% of students will meet the standard.

Implementation Plan (timeline): Although the instructor will regularly introduce cultural elements into the beginning and intermediate language classes in order to pave the way, actual assignments, measures, and assessments in this area will wait until the 300-level CLAS courses due to the great amount of language work that needs to be accomplished in the 100-200 level Latin courses.

Key/Responsible Personnel: Dr. Joshua Davies

Notes: Due to doubly unfortunate circumstances (a crashed hard drive and boxes lost in a move), findings are not available for 2008-2009. Care has been taken for more secure preservation of data in future years.

Supporting Attachments:

No Findings Added to Essay and class discussion

VI.2 Culture

Mapped to:No Mapping

Measures & Findings

Essays

Details/Description: The 300-level Classics courses (e.g. CLAS 396: Classical Mythology; CLAS 310: The Greco-Roman World; etc.) emphasize the broad lines and features of Classical civilization as a whole, from the divinities of ancient religion to social realities and literary history. The essay exams and term papers of the students will be used to assess their competency in this area. It will be expected that 75% of them will demonstrate a substantial grasp of these broader currents.

No Findings Added to Essays


Target: 75% of students will meet the standard of their semester level.

Implementation Plan (timeline): Although the instructor will regularly introduce cultural elements into the beginning and intermediate language classes in order to pave the way, actual assignments, measures, and assessments in this area will wait until the 300-level CLAS courses due to the great amount of language work that needs to be accomplished in the 100-200 level Latin courses.

Key/Responsible Personnel: Dr. Joshua Davies

Notes: Due to doubly unfortunate circumstances (a crashed hard drive and boxes lost in a move), findings are not available for 2008-2009. Care has been taken for more secure preservation of data in future years.

Supporting Attachments:


Report: Assessment Plan Details for: Geology

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Geology

Outcomes

Graduates will have a general knowledge of geology

Mapped to:No Mapping

Measures & Findings

ACAT: overall scores
Program level; Direct - Exam

Details/Description: Prospective graduates will take the Area Concentration Achievement Test (ACAT) for geology after completing the geology curriculum. The ACAT will include 4 content areas reflective of geology core courses: mineralogy, petrology, stratigraphy, and structural geology.

Target: The mean of overall scores on the ACAT for graduating seniors will be at or above the 50th percentile compared to national norms.

Implementation Plan (timeline): The ACAT will be administered by OPEIR staff during April of 2009. Results are expected during mid to late summer, 2009, and will be included in the 2008-2009 outcomes assessment report.

Key/Responsible Personnel: Jonathan Mies

Supporting Attachments:

Findings for ACAT: overall scores

Summary of Findings: Five prospective graduates took the ACAT in early April 2009. Overall scores are generally good: 474, 446, 542, 522, and 583, the mean of which is 513. [Standard scores are based on mean = 500, std. dev. = 100.] The individual scores correspond to the 40th, 29th, 66th, 59th, and 80th percentiles and the 4th, 4th, 6th, 5th, and 7th stanines. Three of the five scores are above the mean for the national comparison group, as is the mean of all five scores.

These results indicate that this objective was met.

These results are somewhat worse than those of last year (2007-2008 cycle, mean overall score = 574, n=6), despite there having been no change made to the geology curriculum. While we are pleased that we met our objective, we wish these scores were better. We recognize that overall scores for such small numbers of students are not statistically significant.
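The percentile and stanine figures above follow from the ACAT standard-score scale (mean = 500, std. dev. = 100). As a rough check, a short Python sketch, assuming the national norms are approximately normally distributed and using the common "2z + 5" stanine convention (the helper names below are illustrative and not part of the ACAT materials), reproduces the reported values:

    import math

    def percentile(score, mean=500.0, sd=100.0):
        # Approximate percentile rank from the normal CDF.
        z = (score - mean) / sd
        return 100.0 * 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    def stanine(score, mean=500.0, sd=100.0):
        # Approximate stanine: 2z + 5, rounded and clipped to the 1-9 scale.
        z = (score - mean) / sd
        return min(9, max(1, round(2.0 * z + 5.0)))

    scores = [474, 446, 542, 522, 583]             # 2008-2009 overall scores
    print(sum(scores) / len(scores))               # 513.4 -> reported mean of 513
    print([round(percentile(s)) for s in scores])  # [40, 29, 66, 59, 80]
    print([stanine(s) for s in scores])            # [4, 4, 6, 5, 7]

The same conversion applies to the content-area means reported below (for example, a mean of 515 corresponds to roughly the 56th percentile).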

Target Achievement: Met

Recommendations : We continue our efforts to improve upon ACAT test scores by requiring students to take cumulative and comprehensive exams in each course.

Notes :

Substantiating Evidence:

Graduates will have knowledge of mineralogy


Mapped to:No Mapping

Measures & Findings

ACAT: mineralogy scores
Program level; Direct - Exam

Details/Description: Prospective graduates will take the Area Concentration Achievement Test (ACAT) for geology after completing the geology curriculum. The ACAT will include 4 content areas reflective of geology core courses: mineralogy, petrology, stratigraphy, and structural geology.

Target: The mean of mineralogy content-area scores on the ACAT for graduating seniors will be at or above the 50th percentile compared to national norms.

Implementation Plan (timeline): The ACAT will be administered by OPEIR staff during April of 2009. Results are expected during mid to late summer, 2009, and will be included in the 2008-2009 outcomes assessment report.

Key/Responsible Personnel: Jonathan Mies

Supporting Attachments:

Findings for ACAT: mineralogy scores

Summary of Findings: Five prospective graduates took the ACAT in early April 2009. Mineralogy content-area scores are generally good: 523, 583, 488, 523, and 458. [Standard scores are based on mean = 500, std. dev. = 100.] The mean of mineralogy content-area scores is 515, which corresponds to the 56th percentile.

These results indicate that this objective was met.

These results are somewhat worse than those of last year (2007-2008 cycle, mean mineralogy content-area score = 588, n=6), despite there having been no change made to the mineralogy curriculum. While we are pleased that we met our objective, we wish these scores were better. We recognize that overall scores for such small numbers of students are not statistically significant.

Target Achievement: Met

Recommendations : We continue our efforts to improve upon ACAT mineralogy content-area scores by requiring students to take cumulative and comprehensive exams in Mineralogy.

Notes :

Substantiating Evidence:

Graduates will have knowledge of petrology

Mapped to:No Mapping

Measures & Findings

ACAT: petrology scores
Program level; Direct - Exam

Details/Description: Prospective graduates will take the Area Concentration Achievement Test (ACAT) for geology after completing the geology curriculum. The ACAT will include 4 content areas reflective of geology core courses: mineralogy, petrology, stratigraphy, and structural geology.

Target: The mean of petrology content-area scores on the ACAT for graduating seniors will be at or above the 50th percentile compared to national norms.

Implementation Plan (timeline): The ACAT will be administered by OPEIR staff during April of 2009. Results are expected during mid to late summer, 2009, and will be included in the 2008-2009 outcomes assessment report.

Key/Responsible Personnel: Jonathan Mies

Supporting Attachments:

Findings for ACAT: petrology scores

Summary of Findings: Five prospective graduates took the ACAT in early April 2009. Petrology content-area scores are 534, 361, 398, 436, and 523. [Standard scores are based on mean = 500, std. dev. = 100.] Two of these are quite good and exceed the mean for the national comparison group, while 3 are low. The mean petrology content-area score is 450, which corresponds to the 31st percentile.

These results indicate that this objective was not met.

These results are somewhat worse than those of last year (2007-2008 cycle, mean petrology content-area score = 497, n=6), despite there having been no change made to the petrology curriculum. We recognize that overall scores for such small numbers of students are not statistically significant.

Target Achievement: Not Met

Recommendations : We continue our efforts to improve upon ACAT petrology content-area scores by requiring students to take cumulative and comprehensive exams in Petrology.


Notes :

Substantiating Evidence:

Graduates will have knowledge of stratigraphy

Mapped to:No Mapping

Measures & Findings

ACAT: stratigraphy scores
Program level; Direct - Exam

Details/Description: Prospective graduates will take the Area Concentration Achievement Test (ACAT) for geology after completing the geology curriculum. The ACAT will include 4 content areas reflective of geology core courses: mineralogy, petrology, stratigraphy, and structural geology.

Target: The mean of stratigraphy content-area scores on the ACAT for graduating seniors will be at or above the 50th percentile compared to national norms.

Implementation Plan (timeline): The ACAT will be administered by OPEIR staff during April of 2009. Results are expected during mid to late summer, 2009, and will be included in the 2008-2009 outcomes assessment report.

Key/Responsible Personnel: Jonathan Mies

Supporting Attachments:

Findings for ACAT: stratigraphy scores

Summary of Findings: Five prospective graduates took the ACAT in early April 2009. Stratigraphy content-area scores are generally good: 404, 367, 578, 493, and 622. [Standard scores are based on mean = 500, std. dev. = 100.] The mean of stratigraphy content-area scores is 513, which corresponds to the 55th percentile.

These results indicate that this objective was met.

These results are somewhat worse than those of last year (2007-2008 cycle, mean stratigraphy content-area score = 534, n=6), despite there having been no change made to the stratigraphy curriculum. While we are pleased that we met our objective, we wish these scores were better. We recognize that overall scores for such small numbers of students are not statistically significant.

Target Achievement: Met

Recommendations : We continue our efforts to improve upon ACAT stratigraphy content-area scores by requiring students to take cumulative and comprehensive exams in Historical Geology and Sedimentary Rocks and Stratigraphy.

Notes :

Substantiating Evidence:

Graduates will have knowledge of structural geology

Mapped to:No Mapping

Measures & Findings

ACAT: structural geology scores
Program level; Direct - Exam

Details/Description: Prospective graduates will take the Area Concentration Achievement Test (ACAT) for geology after completing the geology curriculum. The ACAT will include 4 content areas reflective of geology core courses: mineralogy, petrology, stratigraphy, and structural geology.

Target: The mean of structural geology content-area scores on the ACAT for graduating seniors will be at or above the 50th percentile compared to national norms.

Implementation Plan (timeline): The ACAT will be administered by OPEIR staff during April of 2009. Results are expected during mid to late summer, 2009, and will be included in the 2008-2009 outcomes assessment report.

Findings for ACAT: structural geology scores

Summary of Findings: Five prospective graduates took the ACAT in early April 2009. Structural geology content-area scores are generally good: 460, 520, 640, 507, and 627. [Standard scores are based on mean = 500, std. dev. = 100.] The mean of structural geology content-area scores is 551, which corresponds to the 69th percentile.

Target Achievement: Exceeded

Recommendations : We continue our efforts to improve upon ACAT structural geology content-area scores by requiring students to take cumulative and comprehensive exams in Structural Geology.

Notes :


Key/Responsible Personnel: Jonathan Mies

Supporting Attachments:

Substantiating Evidence:

Graduates will be satisfied with education and training

Mapped to:No Mapping

Measures & Findings

Exit interview
Program level; Indirect - Interview

Details/Description: In lieu of or in addition to completing an exit questionnaire (see Measure: Exit questionnaire), prospective graduates will be invited to offer their opinions and constructive criticisms of the geology program during an exit interview by the department head.

Target: All interviewed graduates will indicate that they are satisfied with the education and training that they received as a student in the geology program.

Implementation Plan (timeline): Prospective graduates will be invited to discuss the geology program with the department head at or near the time of their graduation.

Key/Responsible Personnel: Habte Churnet (Department Head), Jonathan Mies

Supporting Attachments:

Findings for Exit interview

Summary of Findings: Although no formal interviews were conducted, the department head did receive comments from upper-level students, including the prospective graduates. Comments were generally positive.

Target Achievement:

Recommendations : We will make a greater effort in the future to encourage students to discuss the program with the department head.

Notes :

Substantiating Evidence:

Exit questionnaire
Program level; Indirect - Survey

Details/Description: Prospective graduates will complete an exit questionnaire designed jointly by geology faculty and OPEIR.

Target: Prospective graduates will agree or strongly agree with each of the following statements: (1) I am satisfied with the education and training that I received as a student in the geology program at UTC; (2) I am satisfied with the academic advisement that I received as a student in the geology program at UTC; (3) geology faculty at UTC convey an in-depth knowledge of the subjects that they teach; (4) geology faculty at UTC relate to students in an academically productive way; and (5) when called upon, geology faculty at UTC are willing to help students.

Implementation Plan (timeline): Prospective graduates will complete the exit questionnaire at or near the time of their graduation and will do so anonymously. Students will deliver completed questionnaires directly to the department secretary, who will provide them to responsible personnel during mid to late summer for inclusion in the 2007-2008 outcomes assessment report.

Key/Responsible Personnel: Jonathan Mies

Supporting Attachments:

Findings for Exit questionnaire

Summary of Findings: Four (of 6) prospective graduates completed the exit questionnaire.

All respondents indicated satisfaction with the education, training, and academic advisement that they received and strongly agreed that faculty are knowledgeable in their respective subject areas, relate well to students, and are willing to help students.

One respondent was critical of the program's facilities. Two respondents were complimentary of how well faculty relate to students and the extent to which they make themselves available to students.

Target Achievement: Exceeded

Recommendations : We will continue to work to ensure that students are satisfied with the education and training that they receive.

Notes :

Substantiating Evidence:


Enhanced competencies due to GEOL 111 (General Education)

Mapped to:No Mapping

Measures & Findings

GEOL 111: pretest/post-test
Course level; Direct - Exam

Details/Description: A subset of questions from the final exam for Geology 111 will also constitute a pretest to be administered at the beginning of the semester. As part of the final exam, these questions will be graded as a post-test for each student. Pretest and post-test questions, if not identical, will be very similar, and will be the same for all sections of Geology 111.

NOTE: THIS OUTCOME AND ASSESSMENT IS PROPOSED FOR INITIAL IMPLEMENTATION DURING THE FALL OF 2009 OR SPRING OF 2010 (2009-2010 OUTCOMES CYCLE).

Target: The average of students’ scores for the post-test (T2) will be at least 50 relative percent better than the average of students’ scores for the pretest (T1), as calculated by [((T2-T1)/T1) ≥ 0.50].

Implementation Plan (timeline): The pretest and post-test will be administered to all sections of Geology 111 during the fall semester of 2009 and the spring semester of 2010. The pretest will be administered at the beginning of each semester, within the first 3 hours of class. The post-test will be part of the final exam, to be administered at the end of each semester. Results will be included in the 2009-2010 outcomes assessment report.

UPDATE: The pretest was administered to all sections of Geology 111 during the present semester (fall 2009). Average scores are as follows: section 001, 46%; section 002, 40%; section 003, 42%; section 004, 48%. These results are very similar to those of a pilot study conducted during the spring semester of 2008, in which the average score on the pretest was 40%; the average score on the post-test was 72%, which is 80 relative percent better than that of the pretest.
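For clarity, the minimal sketch below applies the relative-improvement criterion stated in the target above to the pilot-study averages mentioned in this update (pretest 40%, post-test 72%); the function name is ours and is for illustration only.

# Relative-improvement criterion from the target: (T2 - T1) / T1 >= 0.50,
# i.e. the post-test average must be at least 50 relative percent better.
def meets_target(t1, t2, threshold=0.50):
    """Return True if the relative improvement from t1 to t2 meets the threshold."""
    return (t2 - t1) / t1 >= threshold

t1, t2 = 40.0, 72.0                        # pilot-study pretest and post-test averages
improvement = (t2 - t1) / t1               # 0.80, i.e. 80 relative percent, as reported
print(improvement, meets_target(t1, t2))   # -> 0.8 True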

Key/Responsible Personnel: class instructor (Brodie, Churnet, Holmes, Mies, Williams), Jonathan Mies

Supporting Attachments:

No Findings Added to GEOL 111: pretest/post-test

Enhanced competencies due to GEOL 112 (General Education)

Mapped to:No Mapping

Measures & Findings

GEOL 112: pretest/post-test
Course level; Direct - Exam

Details/Description: A subset of questions from the final exam for Geology 112 will also constitute a pretest to be administered at the beginning of the semester. As part of the final exam, these questions will be graded as a post-test for each student. Pretest and post-test questions, if not identical, will be very similar, and will be the same for all sections of Geology 112.

No Findings Added to GEOL 112: pretest/post-test

NOTE: THIS OUTCOME AND ASSESSMENT IS PROPOSED FOR INITIAL IMPLEMENTATION DURING THE FALL OF 2009 OR SPRING OF 2010 (2009-2010 OUTCOMES CYCLE).

Target: The average of students’ scores for the post-test (T2) will be at least 50 relative percent better than the average of students’ scores for the pretest (T1), as calculated by [((T2-T1)/T1) ≥ 0.50].

Implementation Plan (timeline): The pretest and post-test will be administered to all sections of Geology 112 during the spring semester of 2010. The pretest will be administered at the beginning of each semester, within the first 3 hours of class. The post-test will be part of the final exam, to be administered at the end of each semester. Results will be included in the 2009-2010 outcomes assessment report.

Key/Responsible Personnel: class instructor (Brodie, Churnet, Holmes, Mies, Williams), Jonathan Mies

Supporting Attachments:

Enhanced competencies due to GEOL 116 (General Education)

Mapped to:No Mapping

Measures & Findings

GEOL 116: pretest/post-test
Course level; Direct - Exam

Details/Description: A subset of questions from the final exam for Geology 116 will also constitute a pretest to be administered at the beginning of the semester. As part of the final exam, these questions will be graded as a post-test for each student. Pretest and post-test questions, if not identical, will be very similar, and will be the same for all sections of Geology 116.

NOTE: THIS OUTCOME AND ASSESSMENT IS PROPOSED FOR INITIAL IMPLEMENTATION DURING THE FALL OF 2009 OR SPRING OF 2010 (2009-2010 OUTCOMES CYCLE).

Target: The average of students’ scores for the post-test (T2) will be at least 50 relative percent better than the average of students’ scores for the pretest (T1), as calculated by [((T2-T1)/T1) ≥ 0.50].

Implementation Plan (timeline): The pretest and post-test will be administered to all sections of Geology 116 during the spring semester of 2010. The pretest will be administered at the beginning of each semester, within the first 3 hours of class. The post-test will be part of the final exam, to be administered at the end of each semester. Results will be included in the 2009-2010 outcomes assessment report.

Key/Responsible Personnel: class instructor (Brodie, Churnet, Holmes, Mies, Williams), Jonathan Mies

No Findings Added to GEOL 116: pretest/post-test


Supporting Attachments:

Enhanced competencies due to GEOL 225 (General Education)

Mapped to:No Mapping

Measures & Findings

GEOL 225: pretest/post-test
Course level; Direct - Exam

Details/Description: A subset of questions from the final exam for Geology 225 will also constitute a pretest to be administered at the beginning of the semester. As part of the final exam, these questions will be graded as a post-test for each student. Pretest and post-test questions, if not identical, will be very similar, and will be the same for all sections of Geology 225.

NOTE: THIS OUTCOME AND ASSESSMENT IS PROPOSED FOR INITIAL IMPLEMENTATION DURING THE FALL OF 2009 OR SPRING OF 2010 (2009-2010 OUTCOMES CYCLE).

Target: The average of students’ scores for the post-test will be at least 20 relative percent better [(T2-T1)/((T1+T2)/2) ≥ 0.20] than the average of students’ scores for the pretest.
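Note that this criterion differs from the GEOL 111/112/116 targets: the gain is scaled by the mean of the pretest and post-test averages rather than by the pretest average alone, and the threshold is 20 rather than 50 relative percent. A minimal sketch of the calculation follows; the function name and example values are ours, for illustration only.

# GEOL 225 criterion: (T2 - T1) / ((T1 + T2) / 2) >= 0.20
def symmetric_relative_gain(t1, t2):
    """Relative gain using the mean of the two averages as the denominator."""
    return (t2 - t1) / ((t1 + t2) / 2)

# Hypothetical averages: a 40% pretest and 52% post-test give a gain of about
# 0.26, which would meet the 0.20 threshold.
print(symmetric_relative_gain(40.0, 52.0) >= 0.20)   # -> True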

Implementation Plan (timeline): The pretest and post-test will be administered to all sections of Geology 225 during the spring semester of 2010. The pretest will be administered at the beginning of each semester, within the first 3 hours of class. The post-test will be part of the final exam, to be administered at the end of each semester. Results will be included in the 2009-2010 outcomes assessment report.

Key/Responsible Personnel: class instructor (Brodie, Churnet, Holmes, Mies, Williams), Jonathan Mies

Supporting Attachments:

No Findings Added to GEOL 225: pretest/post-test


Report: Assessment Plan Details for: Legal Assistant Studies: BS

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

1. Mastery of Practical Skills

Outcomes

1.1 Practical Skills required for Career

Mapped to:Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 1.1 Service Learning, Measurable Outcome 1.3 Experiential Learning Opportunities, Measurable Outcome 2.1 Distinctive Experience Outside Class, Measurable Outcome 2.2 Student Satisfaction, Measurable Outcome 3.9 Student Satisfaction

Measures & Findings

Practical Skills Gained in University and Major Courses
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation and Institutional Research.

Target: The Legal Assistant Studies majors will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s National Survey of Student Engagement which assesses whether a student’s experience at UTC led them to acquire job or work related knowledge and skills.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; All faculty; McGuffee

Supporting Attachments:

Findings for Practical Skills Gained in University and Major Courses

Summary of Findings: The Legal Assistant Studies majors had a mean of 2.75 on this item.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

1.2 Student Internships

Mapped to:Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 1.1 Service Learning, Measurable Outcome 1.3 Experiential Learning Opportunities, Measurable Outcome 2.1 Distinctive Experience Outside Class, Measurable Outcome 2.2 Student Satisfaction

Measures & Findings

Practical Skills Gained in Internship Program
Program level; Direct - Student Artifact

Details/Description: This item is measured by student records kept by the Internship Coordinator.

Target: At least 75% of the internships will result in a grade of C or better (as determined by the internship coordinator with input from the field supervisor).

Implementation Plan (timeline): 2008-09

Findings for Practical Skills Gained in Internship Program

Summary of Findings: 82% of the students completing internships received a C or better.

Target Achievement: Exceeded

Recommendations :


Key/Responsible Personnel: McGuffee

Supporting Attachments:

Notes :

Substantiating Evidence:

1.3 Service Learning

Mapped to:Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 1.1 Service Learning, Measurable Outcome 1.3 Experiential Learning Opportunities, Measurable Outcome 2.1 Distinctive Experience Outside Class

Measures & Findings

Service and/or Experiential Learning
Program level; Direct - Student Artifact

Details/Description: This item is measured by student records kept by the Internship Coordinator.

Target: 100% of Legal Assistant Studies majors will complete an internship.

Implementation Plan (timeline): 08-09

Key/Responsible Personnel: McGuffee

Supporting Attachments:

Findings for Service and/or Experiential Learning

Summary of Findings: 100% of Legal Assistant Studies majors completed an internship.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

1.4 Applying Theory to Practical Problems

Mapped to:No Mapping

Measures & Findings

Applying Practical Problems
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation and Institutional Research.

Target: The Legal Assistant Studies majors will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s National Survey of Student Engagement which assesses whether a student’s experience at UTC emphasized applying theories or concepts to practical problems.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; All program faculty; McGuffee

Supporting Attachments:

Findings for Applying Practical Problems

Summary of Findings: The Legal Assistant Studies majors had a mean of 3.2 on this item.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

2. Mastery of Writing Skills

Outcomes

2.1 Student Assessment of Writing Skills

Mapped to:Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 2.2 Student Satisfaction

Measures & Findings

Writing Skills
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Findings for Writing Skills

Target: The Legal Assistant Studies program will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s National Survey of Student Engagement which assesses whether a student’s experience at UTC added to their ability to write clearly and effectively.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; All program staff; McGuffee

Supporting Attachments:

Summary of Findings: The Legal Assistant Studies majors had a mean of 2.75 on this item.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

2.2 Legal Research and Writing Skills

Mapped to:No Mapping

Measures & Findings

Mastery of Legal Research and Writing Skills
Course level; Direct - Student Artifact

Details/Description: This data is obtained by student records kept by the faculty member teaching LAS 390.

Target: At least 75% of the LAS students will receive a grade of C or better in the Advanced Legal Research and Writing Course (LAS 390).

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: LAS 390 Instructor; McGuffee

Supporting Attachments:

Findings for Mastery of Legal Research and Writing Skills

Summary of Findings: 96% of the LAS students received a grade of C or better in the Advanced Legal Research and Writing course.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

3. Mastery of Computer Skills

Outcomes

3.1 Student Assessment of Computer Skills

Mapped to:Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 2.2 Student Satisfaction

Measures & Findings

Student Assessment of Computer Skills
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation and Institutional Research.

Target: The Legal Assistant Studies program will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research Enrolled Student Survey that assesses whether a student used an electronic medium to discuss or complete an assignment; worked on an assignment where they used a computer; and contributed to their ability to use computing and informational technology.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; All program faculty; McGuffee

Supporting Attachments:

Findings for Student Assessment of Computer Skills

Summary of Findings: The Legal Assistant Studies majors had means of 3.2, 3.0, and 2.75 on these items.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

3.2 Demonstrate Computer Skills

Mapped to:No Mapping

Measures & Findings

Demonstrate Computer Skills needed in a Legal Office
Course level; Direct - Student Artifact

Details/Description: This data is obtained from student records kept by the faculty member teaching LAS 360.

Target: At least 75% of the Legal Assistant Studies majors will receive a grade of B or better in the Law Office Management and Computer Applications Course (LAS 360).

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: LAS 360 Instructor; McGuffee

Supporting Attachments:

Findings for Demonstrate Computer Skills needed in a Legal Office

Summary of Findings: 93.5% of the LAS students received a grade of B or higher in the LAS 360 Law Office Management and Computer Applications class.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

3.3 Using Computers and Technology

Mapped to:Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 2.2 Student Satisfaction

Measures & Findings

Computer Use
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: The Legal Assistant Studies majors will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s National Survey of Student Engagement which assesses whether a student’s experience at UTC led them to report that they have the ability to use computer and information technology.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; All program faculty; McGuffee

Supporting Attachments:

Findings for Computer Use

Summary of Findings: The Legal Assistant Studies majors had a mean of 2.75 on this item.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:


4. Mastery of Oral Communication Skills

Outcomes

4.1 Reported use of Oral Communication Skills

Mapped to:Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 2.2 Student Satisfaction

Measures & Findings

Reported Oral Communication Skills
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: The Legal Assistant Studies program will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses whether a student asked questions in class or contributed to class discussions, and whether they made a class presentation.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; All program faculty; McGuffee

Supporting Attachments:

Findings for Reported Oral Communication Skills

Summary of Findings: The Legal Assistant Studies majors had a mean of 2.6 on both of these items.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

4.2 Demonstrate Oral Communication Skills

Mapped to:No Mapping

Measures & Findings

Oral Presentation
Program level; Direct - Student Artifact

Details/Description: This data is obtained from student records kept by the faculty member teaching LAS 471.

Target: At least 75% of the Legal Assistant Studies students will receive a grade of C or better on the paper/presentation required in the Legal Ethics Course (471).

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: McGuffee

Supporting Attachments:

Findings for Oral Presentation

Summary of Findings: 89.7% of the LAS students received a grade of C or better on the paper/presentation required in the Legal Ethics and Professionalism class.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

4.3 Speak Effectively and Clearly

Mapped to:No Mapping

Measures & Findings

Speak Effectively
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: The Legal Assistant Studies majors will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s National Survey of Student Engagement which assesses whether a student’s experience at UTC led them to report they can speak clearly and effectively.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; All Program Faculty; McGuffee

Supporting Attachments:

Findings for Speak Effectively

Summary of Findings: The Legal Assistant Studies majors had a mean of 2.5 on this item.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

5. Mastery of Critical Thinking Skills

Outcomes

5.1 Integration of Ideas

Mapped to:Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 2.2 Student Satisfaction

Measures & Findings

Integrating Material from Various Sources
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: The Legal Assistant Studies program will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses whether a student integrated ideas or information from various sources and courses, and during class discussions.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; Program faculty; McGuffee

Supporting Attachments:

Findings for Integrating Material from Various Sources

Summary of Findings: The Legal Assistant Studies majors had means of 3.20 and 2.80 on these items.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

5.2 Analyzing and Synthesizing Information

Mapped to:No Mapping

Measures & Findings

Analyzing and Synthesizing
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Findings for Analyzing and Synthesizing

Summary of Findings: The Legal Assistant Studies majors had means of 3.20 and 2.80 on these items.

Target: The Legal Assistant Studies program will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses whether a student analyzed the basic elements of an idea, experience or theory and whether they synthesized information into more complex ideas.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; All program faculty; McGuffee

Supporting Attachments:

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

5.3 Making Judgements

Mapped to:No Mapping

Measures & Findings

Judgement
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: The Legal Assistant Studies program will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses whether a student made judgements about information and data.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; All program faculty; McGuffee

Supporting Attachments:

Findings for Judgement

Summary of Findings: The Legal Assistant Studies majors had a mean of 3.20 on this item.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

5.4 Reported Ability to Think Critically and Analytically

Mapped to:No Mapping

Measures & Findings

Ability to Think Critically
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: The Legal Assistant Studies program will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses whether a student’s experience at UTC added to their ability to think clearly and analytically.

Implementation Plan (timeline): 2008-09

Findings for Ability to Think Critically

Summary of Findings: The Legal Assistant Studies majors had a mean of 2.50 on this item.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:


Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; All program faculty; McGuffee

Supporting Attachments:

6. Student Retention

Outcomes

6.1 Quality Advising

Mapped to:Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 2.4 Strong Commitment to Program, Measurable Outcome 2.5 Student Engagement

Measures & Findings

Student Assessment of Advising
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: The Legal Assistant Studies students will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses the quality of advising they received at UTC.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; McGuffee

Supporting Attachments:

Findings for Student Assessment of Advising

Summary of Findings: The Legal Assistant Studies majors had a mean of 3.33 on this item.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

6.2 Scheduling

Mapped to:Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 2.7 Retention & Persistence, Measurable Outcome 4.4 Retention/Graduation

Measures & Findings

Course Rotation Plan
Program level; Direct - Other

Details/Description: This data is taken from the class schedules for the year.

Target: 100% of all required courses will be offered at least once a year.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: McGuffee

Supporting Attachments:

Findings for Course Rotation Plan

Summary of Findings: 100% of all required courses were offered at least once a year.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

6.3 Quality of Relationship between Students and Faculty

Mapped to:Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 2.4 Strong Commitment to Program, Measurable Outcome 2.5 Student Engagement

Measures & Findings

Student Faculty Relationships
Program level; Indirect - Survey

Findings for Student Faculty Relationships

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: The Legal Assistant Studies program will achieve a mean score of 4.0 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses students’ perceptions about the quality of relationships between faculty and students.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; All program faculty; McGuffee

Supporting Attachments:

Summary of Findings: The Legal Assistant Studies majors had a mean of 5.0 on this item.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

7. Student Satisfaction

Outcomes

7.1 Student Satisfaction with Overall Quality

Mapped to:Strategic Initiative: Partnerships for Students [Teaching & Learning]: Measurable Outcome 3.9 Student Satisfaction

Measures & Findings

Overall Student Satisfaction
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: The Legal Assistant Studies program will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses the entire educational experience.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; All program faculty; McGuffee

Supporting Attachments:

Findings for Overall Student Satisfaction

Summary of Findings: The Legal Assistant Studies majors had a mean of 3.0 on this item.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

8. Exposure to Diversity

Outcomes

8.1 Exposure to Diverse People and Perspectives

Mapped to:Strategic Initiative: Partnerships for Diversity: Measurable Outcome 1.5 Increased Tolerance

Measures & Findings

Diversity Experiences in Person or in Class
Program level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: The Legal Assistant Studies program will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses whether they included diverse perspectives in class discussions or writing assignments and had serious conversations with students of a different race/ethnicity.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: Office of Planning, Evaluation, and Institutional Research; All Program Faculty; McGuffee

Supporting Attachments:

Findings for Diversity Experiences in Person or in Class

Summary of Findings: The Legal Assistant Studies majors had a mean of 2.60 on both of these items.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

8.2 Increased Understanding of Diverse Groups

Mapped to:Strategic Initiative: Partnerships for Diversity: Measurable Outcome 1.5 Increased Tolerance

Measures & Findings

Understanding Other Races/Ethnicity
Institution level; Indirect - Survey

Details/Description: This survey is administered by the Office of Planning, Evaluation, and Institutional Research.

Target: The Legal Assistant Studies program will achieve a mean score of 2.5 or higher on the Office of Planning, Evaluation, and Institutional Research’s Enrolled Student Survey that assesses whether they had an increased understanding of people from other racial/ethnic backgrounds.

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: All program faculty; McGuffee

Supporting Attachments:

Findings for Understanding Other Races/Ethnicity

Summary of Findings: The Legal Assistant Studies majors had a mean of 3.00 on this item.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

8.3 Integration of Ethical Issues in the Curriculum

Mapped to:Strategic Initiative: Partnerships for Diversity: Measurable Outcome 1.5 Increased Tolerance

Measures & Findings

Participate in Ethics Class
Course level; Direct - Other

Details/Description: The data will be taken from the student records kept by the Coordinator.

Target: 100% of graduating majors will take an ethics class in legal assistant studies (LAS 471).

Implementation Plan (timeline): 2008-09

Key/Responsible Personnel: McGuffee

Findings for Participate in Ethics Class

Summary of Findings: 100% of all graduating majors took the ethics (LAS 471) course.

Target Achievement: Met

Recommendations :

Notes :


Supporting Attachments:

Substantiating Evidence:


Report: Assessment Plan Details for: Music

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

BM Music

Outcomes

Recital

Mapped to:No Mapping

Measures & Findings

Recital
Program level; Direct - Other

Details/Description: Students will pass the Division Jury pre-recital hearing.

Target: 80%

Implementation Plan (timeline): Fall and Spring

Key/Responsible Personnel: Division Jury committee

Supporting Attachments:

Findings for Recital

Summary of Findings: 100% of students passed the Division Jury pre-recital hearing.

Target Achievement: Exceeded

Recommendations : Continue to communicate expectations via studio instructors and Music Student Handbook.

Notes :

Substantiating Evidence:

Score Reading

Mapped to:No Mapping

Measures & Findings

Score Reading
Program level; Direct - Exam

Details/Description: Students enrolled in MUS 303, Basic Conducting, will pass a music reading examination, a portion of the final examination for the course.

Target: 80%

Implementation Plan (timeline): Fall

Key/Responsible Personnel: Conducting faculty

Supporting Attachments:

Findings for Score Reading

Summary of Findings: All students enrolled in MUS 303, Basic Conducting, passed the final examination.

Target Achievement: Exceeded

Recommendations : Continue providing the conducting lab component in the Music Seminar course to provide conducting opportunities for students enrolled in MUS 303, 310 and 328.

Notes :

Substantiating Evidence:


Listening Analysis

Mapped to:No Mapping

Measures & Findings

Listening Analysis
Program level; Direct - Exam

Details/Description: Students enrolled in MUS 208, Theory II, will pass the final exam, which includes a listening portion.

Target: 80%

Implementation Plan (timeline): Spring

Key/Responsible Personnel: Theory II faculty

Supporting Attachments:

Findings for Listening Analysis

Summary of Findings: 100% of students enrolled in Theory II passed the final exam.

Target Achievement: Exceeded

Recommendations : Continue with current procedures

Notes :

Substantiating Evidence:

Continuation Standard - Theory

Mapped to:No Mapping

Measures & Findings

Theory Proficiency
Program level; Direct - Exam

Details/Description: Students will pass a Sophomore Theory Proficiency Exam as a pre-requisite to upper-level theory and conducting courses.

Target: 90%

Implementation Plan (timeline): Spring

Key/Responsible Personnel: Theory II faculty

Supporting Attachments:

Findings for Theory Proficiency

Summary of Findings: 90% of students enrolled in MUS 208 passed the Sophomore Theory Proficiency Exam (Comprehensive) on the first attempt.

Target Achievement: Met

Recommendations : Continue to monitor exam results.

Notes :

Substantiating Evidence:

Continuation Standard - Performance

Mapped to:No Mapping

Measures & Findings

Continuation Standard - Performance
Program level; Direct - Other

Details/Description: Students will pass a semester-end jury audition to be admitted to 300-level applied study.

Target: 90%

Implementation Plan (timeline): Fall and Spring

Key/Responsible Personnel: Applied Music Faculty

Supporting Attachments:

Findings for Continuation Standard - Performance

Summary of Findings: 90% of students auditioning for 300-level applied music study passed on their first attempt.

Target Achievement: Met

Recommendations : Continue to monitor jury results.

Notes :

Substantiating Evidence:

Master of Music Outcome Set

Outcomes


Preparation for professional life

Mapped to:No Mapping

Measures & Findings

Professional Knowledge
Program level; Direct - Exam

Details/Description: The Graduate Advisory Committee, appointed by the Department Head, will evaluate the candidate's performance on the comprehensive examination and vote "pass" or "fail."

Target: 80% of graduate students will pass the comprehensive oral and written examination on the first attempt.

Implementation Plan (timeline): fall and spring

Key/Responsible Personnel: Graduate music faculty

Supporting Attachments:

Findings for Professional Knowledge

Summary of Findings: 100% of graduate students passed the comprehensive examination on the first attempt.

Target Achievement: Exceeded

Recommendations : Communicate with graduate students and advisors on specific procedures and expectations for the Comprehensive Examination via consultation and the Graduate Music Student Handbook.

Notes :

Substantiating Evidence:

Thesis/project/recital

Mapped to:No Mapping

Measures & Findings

Publication and Performance
Program level; Direct - Other

Details/Description: All theses, projects and recitals will be evaluated as "passed" or "not passed." The Graduate Advisory Committee and Division Jury will evaluate as appropriate.

Target: 100% of graduate students will pass

Implementation Plan (timeline): fall and spring

Key/Responsible Personnel: Graduate Music faculty

Supporting Attachments:

Findings for Publication and Performance

Summary of Findings: 100% of graduate students were passed on their theses, projects and recitals.

Target Achievement: Met

Recommendations : No changes at this time.

Notes :

Substantiating Evidence:


Report: Assessment Plan Details for: Philosophy & Religion

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Assets of technology. Learning environment

Outcomes

Construction of bibliography

Mapped to:No Mapping

Measures & Findings

Thesis Bibliography
Program level; Direct - Student Artifact

Details/Description: 1a. At least 80% of students in PHIL 351 (required of all majors) will prepare an appropriate bibliography for a research paper using book, journal, and internet sources.

1b. At least 90% of students completing the senior thesis will prepare an appropriate bibliography using book, journal, and internet resources. Bibliographies will be judged by members of the student's oral examination committee.

Target: 80% success rate for students in PHIL 351 and 90% success rate for students completing their senior theses.

Implementation Plan (timeline): Yearly.

Key/Responsible Personnel: Faculty in Department.

Supporting Attachments:

Findings for Thesis Bibliography

Summary of Findings: 81.2% of students (13 of 16) in PHIL 351 prepared an appropriate bibliography.

83.3% of students (5 of 6) completing their senior theses prepared an appropriate bibliography.

Target Achievement: Exceeded

Recommendations : Continue to emphasize the importance of constructing appropriate bibliographies.

Notes :

Substantiating Evidence:

Development of writing skills

Mapped to:No Mapping

Measures & Findings

Writing Skills
Program level; Direct - Exam

Details/Description: 75% or more of Philosophy and Religion graduates will score above the UTC mean on the Writing Skills index of the College Base Achievement Test.

Target: 75% or more success rate.

Findings for Writing Skills

Summary of Findings: No data was available (no majors took this part of the Writing Skills index of the College Base Achievement Test).

Target Achievement:


Implementation Plan (timeline): Yearly

Key/Responsible Personnel: Department Head

Supporting Attachments:

Recommendations : Continue to consult the results of this portion of the test.

Notes :

Substantiating Evidence:

Development of reading and textual analysis skills

Mapped to:No Mapping

Measures & Findings

Reading and Textual Analysis Skills
Program level; Direct - Exam

Details/Description: 75% or more of Philosophy and Religion graduates will score above the UTC mean on the Reading Analytically index of the College Base Achievement Test.

Target: 75% success rate.

Implementation Plan (timeline): Yearly.

Key/Responsible Personnel: Department Head

Supporting Attachments:

Findings for Reading and Textual Analysis Skills

Summary of Findings: 66.6% of students (2 of 3 taking the test) scored above the UTC mean on the Reading Analytically index of the College Base Achievement Test.

Target Achievement: Not Met

Recommendations : Work to improve the analytical reading ability of our majors.

Notes :

Substantiating Evidence:


Report: Assessment Plan Details for: Political Science & Public Administration

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Political Science & Public Administration Outcome Set

Outcomes

Outcome 1: American Studies

Mapped to:No Mapping

Measures & Findings

ACAT senior exit exam American Studies
Program level; Direct - Exam

Details/Description: Senior exit exam mandatory for POLS graduation.

Target: 100% of the students test in this area. Score a minimum 50% average overall in proficiency in the American Studies section.

Implementation Plan (timeline): At end of each semester

Key/Responsible Personnel: Departmental Secretary

Supporting Attachments:

Findings for ACAT senior exit exam American Studies

Summary of Findings: Exit exam delivered to December 2008 graduates: 100% of students, total 10 students tested, 73% overall score.

Exit exam delivered to May 2009 graduates: 100% of students, total 21 students tested, 67% overall score.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Outcome 2: International and Comparative Government

Mapped to:No Mapping

Measures & Findings

International and Comparative Government
Direct - Exam

Details/Description: PACAT Test administered by Austin Peay State University.

Target: Will be administered to graduating Political Science majors whose concentration is International and Comparative Studies.

Implementation Plan (timeline): Will be administered in the Spring.

Findings for International and Comparative Government

Summary of Findings: Students scored in the 58th percentile.

Target Achievement:

Recommendations :

Notes :


Key/Responsible Personnel: Departmental Secretary

Supporting Attachments:

Substantiating Evidence:

Outcome 3: Critical Thinking

Mapped to:No Mapping

Measures & Findings

Critical Thinking
Program level; Direct - Exam

Details/Description: Graduates in Political Science will be able to critically analyze an article in a professional journal.

Target: Political Science majors are required to take POLS 200: Research Methods.

Implementation Plan (timeline): Students in POLS 200 read an article in a professional journal and show that they are able to: 1. identify the research hypothesis; 2. analyze and assess the sampling, analytical, and statistical techniques used; 3. analyze and assess the appropriateness of the author's conclusions.

Key/Responsible Personnel: Dr. Christopher Horne

Supporting Attachments:

Findings for Critical Thinking

Summary of Findings: 62% of the students in Dr. Horne's class scored 80% or above.

Target Achievement:

Recommendations :

Notes :

Substantiating Evidence:

Outcome 4: Communication Skills

Mapped to:No Mapping

Measures & Findings

Communication Skills
Program level; Direct - Portfolio

Details/Description: Graduates in Political Science shall demonstrate the ability to write a research paper in which they analyze a political topic, using appropriate references and citations.

Target: Students in all 400-level political science courses who are required to write a research paper.

Implementation Plan (timeline): Based on evaluation of a random sample of 50% of the papers, at least 50% will demonstrate an ability to satisfy at least four of the following criteria:
1. the author's thesis
2. an appropriate methodology
3. level of organization
4. proper grammar, syntax, and style
5. proper use of citations and references
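A minimal sketch of this evaluation rule, with made-up data purely for illustration (the helper names are ours): a paper counts as satisfactory when it meets at least four of the five criteria, and the target is met when at least 50% of the randomly sampled papers qualify.

# Hypothetical illustration of the scoring rule described above.
import random

def paper_is_satisfactory(criteria_met):
    # criteria_met is a list of five booleans, one per criterion listed above
    return sum(criteria_met) >= 4

def target_met(sampled_papers):
    satisfactory = sum(paper_is_satisfactory(p) for p in sampled_papers)
    return satisfactory / len(sampled_papers) >= 0.50

# Made-up pool of papers; sample 50% of them at random, as the plan specifies.
all_papers = [
    [True, True, True, True, False],   # 4/5 criteria -> satisfactory
    [True, False, True, False, True],  # 3/5 criteria -> not satisfactory
    [True, True, True, True, True],    # 5/5 criteria -> satisfactory
    [False, True, True, True, True],   # 4/5 criteria -> satisfactory
]
sample = random.sample(all_papers, k=len(all_papers) // 2)
print(target_met(sample))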

Key/Responsible Personnel: Dr. Fouad Moughrabi

Supporting Attachments:

Findings for Communication Skills

Summary of Findings: 72% of the random sample was deemed "satisfactory" and 28% was deemed "unsatisfactory."

Target Achievement:

Recommendations :

Notes :

Substantiating Evidence:


Report: Assessment Plan Details for: Psychology: BS

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Psychology: BS Outcome Set

Outcomes

1. Intended EducationalOutcome/Objective

Mapped to:No Mapping

Measures & Findings

1. Means of Assessment and Criteria for Success
Course level; Indirect - Survey

Details/Description: A supplemental survey of students at the middle and end of the semester will provide data addressing this outcome. Success will be achieved if at least 70% of students report being at least somewhat satisfied with the quality of the 202 course and lab, and indicate at least a moderate agreement that the course is supportive, yet challenging.

Target:

Implementation Plan (timeline):

Key/Responsible Personnel:

Supporting Attachments:

Findings for 1. Means of Assessment and Criteria for Success

Summary of Findings: At mid-semester, 100% of responding students were at least somewhat satisfied with the quality of the lecture, and 92% were at least somewhat satisfied with the quality of the lab. Of respondents, 92% also indicated the course was challenging, but also that the professors and lab instructors were supportive.

At the end of the semester, 97.4% of students were at least somewhat satisfied with the quality of the lecture and lab. In addition, 92.1% agreed at least somewhat that the course was challenging overall, while 100% indicated that the professors and lab instructors were supportive.

Target Achievement:

Recommendations :

Notes :

Substantiating Evidence:

2. Intended EducationalOutcome/Objective

Mapped to:No Mapping

Measures & Findings

2. Means of Assessment and Criteria for Success
Course level; Indirect - Survey

Details/Description: Supplemental survey results will indicate the degree to which students feel PSY 202 has provided them with useful skills and knowledge to be more effective critical thinkers and problem solvers in the real world. Success will be achieved if at least 70% of reports indicate at least moderate agreement with these statements.

Target:

Implementation Plan (timeline):

Key/Responsible Personnel:

Supporting Attachments:

Findings for 2. Means of Assessment and Criteria for Success

Summary of Findings: With respect to critical thinking ability, 83.3% of respondents at least somewhat agreed that this course had improved their critical thinking ability. As for problem solving, 80.6% at least somewhat agreed that this course improved their ability to be a real-world problem solver.

Target Achievement:

Recommendations :

Notes :

Substantiating Evidence:

3. Intended EducationalOutcome/Objective

Mapped to:No Mapping

Measures & Findings

3. Means of Assessment and Criteria for Success
Course level; Indirect - Survey

Details/Description: Supplemental survey results will indicate that at least 70% of students find the overall lab experience in PSY 202 to be especially helpful to their development of research skills in psychology.

Target:

Implementation Plan (timeline):

Key/Responsible Personnel:

Supporting Attachments:

Findings for 3. Means of Assessment and Criteria for Success

Summary of Findings: Of respondents, 83.3% at least somewhat agreed that the PSY 202 lab was “especially helpful to the development of [your] psychology research skills.”

Target Achievement:

Recommendations :

Notes :

Substantiating Evidence:


Report: Assessment Plan Details for: Psychology: MS

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Psychology: MS Outcome Set

Outcomes

1. Intended EducationalOutcome/Objective

Mapped to:No Mapping

Measures & Findings

1. Means of Assessment and Criteria for Success
Program level; Indirect - Survey

Details/Description: Results from an exit survey will indicate that a majority of graduating students at least Agree with statements that indicate the curriculum has prepared them for their next steps in the field of I-O.

Target:

Implementation Plan (timeline):

Key/Responsible Personnel:

Supporting Attachments:

Findings for 1. Means of Assessment and Criteria for Success

Summary of Findings: Responses to the exit survey from recently graduating students (N = 26) support this objective, as 91.8% of students Agreed or Strongly Agreed that the UTC program curriculum has prepared them to take their next steps in the I-O field.

Target Achievement:

Recommendations :

Notes :

Substantiating Evidence:

2. Intended EducationalOutcome/Objective

Mapped to:No Mapping

Measures & Findings

2. Means of Assessment and Criteria for Success
Program level; Indirect - Survey

Details/Description: Results from an exit survey will show that a majority of graduating students at least Agree with statements suggesting that the comprehensive exam process was worthwhile and helped them to consolidate their knowledge of I-O.

Target:

Implementation Plan (timeline):

Key/Responsible Personnel:

Supporting Attachments:

Findings for 2. Means of Assessment and Criteria for Success

Summary of Findings: Only 5/26 respondents chose to respond to this particular question, but all indicated that the comprehensive exam process was worthwhile and helped them to consolidate their I-O knowledge gained in the UTC program.

Target Achievement:

Recommendations :

Notes :

Substantiating Evidence:


3. Intended EducationalOutcome/Objective

Mapped to:No Mapping

Measures & Findings

3. Means of Assessment and Criteria for Success
Program level; Indirect - Survey

Details/Description: Results from a supplemental end-of-semester teaching evaluation will indicate that at least 70% of students feel their professors are at least Very knowledgeable about the topics they teach and engaging in their presentation of materials within the classroom.

Target:

Implementation Plan (timeline):

Key/Responsible Personnel:

Supporting Attachments:

Findings for 3. Means of Assessment and Criteria for Success

Summary of Findings: 96.2% of all respondents Agreed or Strongly Agreed that professors were “very knowledgeable about the topics they teach,” and 88.5% Agreed or Strongly Agreed that professors were “engaging in their presentation of material in class”.

Target Achievement:

Recommendations :

Notes :

Substantiating Evidence:


Report: Assessment Plan Details for: Theatre & Speech

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Theatre & Speech Outcome Set 07-08

Outcomes

Identify and Use Dramatic Action

Mapped to:No Mapping

Measures & Findings

Directed scene
Course level; Direct - Other

Details/Description: Through directed scenes presented in demonstration labs.

Target: 80% of students will score an 8 out of 10 on the "interpretation" and "development of dramatic action" portions of the Dem-Lab.

Implementation Plan (timeline): End of year.

Key/Responsible Personnel: John Burgess, Gaye Jeffers, Steve Ray, Mac Smotherman

Supporting Attachments:

Findings for Directed scene

Summary of Findings: 1 student scored 10/10; 2 students scored 9/10; 2 students scored 8/10.

Target Achievement: Exceeded

Recommendations :

Notes : The assessment and findings are based solely upon our Directing class which has in the past been used as a "capstone" class. The current faculty does not believe this assessment plan is an effective means of assessment for our current objectives and goals. We are currently revising all assessment tools, objectives, and outcomes.

Substantiating Evidence:

Play Analysis
Course level; Direct - Student Artifact

Details/Description: Through Play Analysis (Hodge) in preparation for scenes directed for demonstration labs.

Target: 80% of students will score an 8 out of 10 on the analysis.

Implementation Plan (timeline):

Key/Responsible Personnel: Gaye Jeffers

Supporting Attachments:

Findings for Play Analysis

Summary of Findings: 1 student scored 10/10; 1 student scored 9/10; 2 students scored 8/10; 1 student scored 7/10.

Target Achievement: Met

Recommendations :

Notes : The assessment and findings are based solely upon our Directing class which has in the past been used as a "capstone" class. The current faculty does not believe this assessment plan is an effective means of assessment for our current objectives and goals. We are currently revising all assessment tools, objectives, and outcomes.

Substantiating Evidence:

Understand Characterization

Mapped to:No Mapping

Measures & Findings

Directing scenes
Course level; Direct - Other

Details/Description: Through scenes directed for demonstration labs.

Target: 80% of students will score 8/10 on the characterization portion of the demonstration lab evaluation.

Implementation Plan (timeline): At least once each semester.

Key/Responsible Personnel: Gaye Jeffers

Supporting Attachments:

Findings for Directing scenes

Summary of Findings: 2 students scored 10/10; 1 student scored 9/10; 2 students scored 8/10.

Target Achievement: Exceeded

Recommendations :

Notes : The assessment and findings are based solely upon our Directing class which has in the past been used as a "capstone" class. The current faculty does not believe this assessment plan is an effective means of assessment for our current objectives and goals. We are currently revising all assessment tools, objectives, and outcomes.

Substantiating Evidence:

Roles Performed in Labs and Productions
Program level; Direct - Other

Details/Description: Through roles performed in demonstration labs and UTC Theatre productions.

Target: 80% of students with an acting focus will score 8 out of 10 on Jury Critique.

Implementation Plan (timeline): Twice each semester.

Key/Responsible Personnel: John Burgess, Gaye Jeffers, Steve Ray, Mac Smotherman

Supporting Attachments:

Findings for Roles Performed in Labs and Productions

Summary of Findings: 1 student scored 10/10; 1 student scored 9/10; 3 students scored 8/10.

Target Achievement: Exceeded

Recommendations :

Notes : The assessment and findings are based solely upon our Directing class which has in the past been used as a "capstone" class. The current faculty does not believe this assessment plan is an effective means of assessment for our current objectives and goals. We are currently revising all assessment tools, objectives, and outcomes.

Substantiating Evidence:

Scenic Environment

Mapped to:No Mapping

Measures & Findings

Floor Plan
Course level; Direct - Student Artifact

Details/Description: Compose a floor plan and use scenic environment in directed scenes for demonstration lab.

Target: 80% of students will score an 8 out of 10 on the "Scenic Environment" portion of the Dem-Lab evaluation.

Findings for Floor Plan

Summary of Findings: 4 students scored 9/10; 1 student scored 7/10.

Target Achievement: Met


Implementation Plan (timeline):

Key/Responsible Personnel: Gaye Jeffers

Supporting Attachments:

Recommendations :

Notes : The assessment and findings are based solely upon our Directing class which has in the past been used as a "capstone" class. The current faculty does not believe this assessment plan is an effective means of assessment for our current objectives and goals. We are currently revising all assessment tools, objectives, and outcomes.

Substantiating Evidence:

Stage Management

Mapped to:No Mapping

Measures & Findings

Skills and disciplines
Program level; Direct - Other

Details/Description: Organization and management of a demonstration lab rehearsal process and/or through service as a crew head and/or stage manager on a UTC Theatre production.

Target: 80% of students will score an 8 out of 10 on Jury Critique.

Implementation Plan (timeline):

Key/Responsible Personnel: John Burgess, Gaye Jeffers, Steve Ray, Mac Smotherman

Supporting Attachments:

Findings for Skills and disciplines

Summary of Findings: 1 student scored 10/10; 1 student scored 9/10; 1 student scored 8/10; 2 students scored 7/10.

Target Achievement: Not Met

Recommendations : Scheduling rehearsals is complicated by the lack of suitable space for students to work. Poor use of time or organization causes students to fall behind and affects others needing the space. More suitable rehearsal spaces are needed.

Notes : The assessment and findings are based solely upon our Directing class which has in the past been used as a "capstone" class. The current faculty does not believe this assessment plan is an effective means of assessment for our current objectives and goals. We are currently revising all assessment tools, objectives, and outcomes.

Substantiating Evidence:

Advanced Work

Mapped to: USA- SACS- Comprehensive Standards (Section 3): 3.3.1.1

Measures & Findings

Specialized Area
Program level; Direct - Other

Details/Description: Through supervised and evaluated student teaching.

Target: All students will successfully complete student teaching experience.

Implementation Plan (timeline):

Key/Responsible Personnel: John Burgess, Gaye Jeffers, Steve Ray, Mac Smotherman

Supporting Attachments:

Findings for Specialized Area

Summary of Findings: 1 student is currently doing advanced work; 1 student is progressing toward advanced work; 3 students show potential.

Target Achievement: Not Met

Recommendations :

Notes : The assessment and findings are based solely upon our Directing class which has in the past been used as a "capstone" class. The current faculty does not believe this assessment plan is an effective means of assessment for our current objectives and goals. We are currently revising all assessment tools, objectives, and outcomes.

Substantiating Evidence:

Specialized Area
Program level; Direct - Other

Details/Description: Through supervised professional internship or international exchange.

Target: All students will score an average of 4.5 out of 5 on a questionnaire sent to the host program.

Implementation Plan (timeline): Upon the completion of an internship or exchange program, a questionnaire along with a stamped envelope will be sent to the host agency for the evaluation of the student.

Key/Responsible Personnel: John Burgess, Gaye Jeffers, Steve Ray, Mac Smotherman

Supporting Attachments:

Findings for Specialized Area

Summary of Findings: 2 students completed an internship with two different host institutions. Both students scored 5/5.

Target Achievement: Exceeded

Recommendations : We should raise our expectations for this outcome, perhaps from 4.5 to 4.8.

Notes : The assessment and findings are based solely upon our Directing class which has in the past been used as a "capstone" class. The current faculty does not believe this assessment plan is an effective means of assessment for our current objectives and goals. We are currently revising all assessment tools, objectives, and outcomes.

Substantiating Evidence:

Specialized Area
Program level; Direct - Other

Details/Description: Through direction of design of UTC Theatre or public workshop production.

Target: 100% of students will score a minimum of 9 out of 10 on 300r and 400r Jury Critiques.

Implementation Plan (timeline):

Key/Responsible Personnel: John Burgess, Gaye Jeffers, Steve Ray, Mac Smotherman

Supporting Attachments:

Findings for Specialized Area

Summary of Findings: 1 student is performing advanced work in directing during the 2009-2010 year; 1 student is progressing toward advanced work in technical theatre; 3 students show potential for advanced work.

Target Achievement: Not Met

Recommendations :

Notes : The assessment and findings are based solely upon our Directing class which has in the past been used as a "capstone" class. The current faculty does not believe this assessment plan is an effective means of assessment for our current objectives and goals. We are currently revising all assessment tools, objectives, and outcomes.

Substantiating Evidence:

Specialized Area
Program level; Direct - Other

Details/Description: Through advanced production or performance work juried by THSP faculty.

Target: 100% of students will score a minimum of 9 out of 10 on 300r and 400r Jury Critiques.

Implementation Plan (timeline): Twice each semester.

Key/Responsible Personnel: John Burgess, Gaye Jeffers, Steve Ray, Mac Smotherman

Supporting Attachments:

Findings for Specialized Area

Summary of Findings: All students have scored 9 or higher.

Target Achievement: Met

Recommendations :

Notes : The assessment and findings are based solely upon our Directing class which has in the past been used as a "capstone" class. The current faculty does not believe this assessment plan is an effective means of assessment for our current objectives and goals. We are currently revising all assessment tools, objectives, and outcomes.

Substantiating Evidence:


Report: Assessment Plan Details for: Accountancy: MAcc

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Accountancy: MAcc Outcome Set

Outcomes

MACC Goal A1: Structured Business Presentation

Mapped to: USA- AACSB- Standards: Assurance of learning standards 5.1, Assurance of learning standards 5.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MACC A1: Structured Business Presentation
Course level; Direct - Exam

Details/Description: Following an in class assignment regarding asset impairment, the students enrolled in 547, Financial Accounting Theory and Issues, will go to Edgar Online or another financial statement database service and identify a company that has had an impairment of long lived assets. In a subsequent class, and drawing upon the asset impairment information gathered from their company’s 10K or 10Q, each student will make an eight to ten minute presentation on the disclosures made in the body of the financial statements as well as in the footnotes.

Target: Above + Meets Expectation is greater than 85%

Implementation Plan (timeline): Fall 2008

Key/Responsible Personnel: Thomas Gavin

Supporting Attachments:

Findings for MACC A1: Structured Business Presentation

Summary of Findings:

Overall:

Meets + Above Expectations 97.05%

Above Expectations 44.10%; Meets Expectations 52.95%; Below Expectations 2.95%

Traits:

1) Visual aids

Meets + Above Expectations: 100%

Above Expectations: 5.9%; Meets Expectations: 94.1%; Below Expectations: 0%

2) Eye Contact

Meets + Above Expectations: 94.1%

Above Expectations: 52.9%; Meets Expectations: 41.2%; Below Expectations: 5.9%

3) Elocution

Meets + Above Expectations: 100%


Above Expectations: 58.8%; Meets Expectations: 41.2%; Below Expectations: 0%

4) Mannerisms

Meets + Above Expectations: 94.1%

Above Expectations: 58.8%; Meets Expectations: 35.3%; Below Expectations: 5.9%

Target Achievement: Exceeded

Recommendations : None needed.

Notes :

Substantiating Evidence:

MACC Goal A2: Effective Information Gathering Techniques

Mapped to: USA- AACSB- Standards: Assurance of learning standards 5.1, Assurance of learning standards 5.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MACC A2: Effective Information Gathering Techniques
Course level; Direct - Exam

Details/Description: Following an in class assignment regarding asset impairment, the students enrolled in 547, Financial Accounting Theory and Issues, will go to Edgar Online or another financial statement database service and identify a company that has had an impairment of long lived assets. In a subsequent class, and drawing upon the asset impairment information gathered from their company’s 10K or 10Q, each student will make an eight to ten minute presentation on the disclosures made in the body of the financial statements as well as in the footnotes.

Target: Above + Meets Expectation is greater than 85%

Implementation Plan (timeline): Fall 2008

Key/Responsible Personnel: Thomas Gavin

Supporting Attachments:

Findings for MACC A2: Effective Information Gathering Techniques

Summary of Findings: Meets + Above Expectations 100%

Above Expectations 70.6%; Meets Expectations 29.4%; Below Expectations 0%

Target Achievement: Exceeded

Recommendations : None needed.

Notes :

Substantiating Evidence:

MACC Goal C2: Implementing Advanced Laws, Regulations, and Standards

Mapped to: USA- AACSB- Standards: Assurance of learning standards 5.1, Assurance of learning standards 5.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MACC C2: Implementing Advanced Laws, Regulations, and Standards
Course level; Direct - Exam

Details/Description: All students enrolled in 547, Financial Accounting Theory and Issues, were given a case regarding SFAS 144, Accounting for the Impairment or Disposal of Long-Lived Assets. Students were asked to develop an expected cash flow outcome given cash flows associated with the following operational elements: net cash flows for the asset in-use and for sale, related time horizons, and related discount rates (to be used as appropriate).

Findings for MACC C2: Implementing Advanced Laws, Regulations, and Standards

Summary of Findings: Meets + Above Expectations 92.65%

Above Expectations 89.7%; Meets Expectations 2.95%; Below Expectations 7.35%


Target: Above + Meets Expectation is greater than 85%

Implementation Plan (timeline): Fall 2008

Key/Responsible Personnel: Thomas Gavin

Supporting Attachments:

Target Achievement: Exceeded

Recommendations : None needed.

Notes :

Substantiating Evidence:

MACC Goal D1: Consequences of Deviating from Ethically Sound Decision-Making

Mapped to: USA- AACSB- Standards: Assurance of learning standards 5.1, Assurance of learning standards 5.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MACC D1: Consequences of Deviating from Ethically Sound Decision-Making
Course level; Direct - Exam

Details/Description: In BACC 589, students prepared a case write-up, which had a general overview and then answered posted questions regarding ethics. The fall class analyzed Accounting Fraud at WorldCom and the spring class analyzed Ethics and Competence at New Century Financial Corporation.

Target: Above + Meets Expectation is greater than 85%

Implementation Plan (timeline): Spring 2009

Key/Responsible Personnel: Melanie McCoskey

Supporting Attachments:

Findings for MACC D1: Consequences of Deviating from Ethically Sound Decision-Making

Summary of Findings: Meets + Above Expectations 76.47%

Above Expectations 41.18%; Meets Expectations 35.29%; Below Expectations 23.53%

Target Achievement: Not Met

Recommendations : Students didn't really understand how the stakeholders fit into the decision-making process. McCoskey will tell them to analyze this issue more fully in future semesters.

Notes :

Substantiating Evidence:

MACC Goal E2:

Mapped to: USA- AACSB- Standards: Assurance of learning standards 5.1, Assurance of learning standards 5.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MACC E2: Knowledge of Advanced Technical Topics
Course level; Direct - Exam

Details/Description: Students in BACC 536 Accounting Information Systems, Summer term 2009, completed 12 cases using Excel 2007. Twenty-five students were evaluated. The metric used was the following: less than 90% correct was below expectations, 90%-95% met expectations, and 96%-100% exceeded expectations. Since these were graduate students who were skilled in Excel, the performance standard was high.

Target: Above + Meets Expectation is greater than 85%

Implementation Plan (timeline): Summer 2009

Key/Responsible Personnel: Marsha Scheidt

Supporting Attachments:

Findings for MACC E2: Knowledge of Advanced Technical Topics

Summary of Findings: In the 8 cases that required basic skills, over 60% of the students exceeded expectations.

In two of the 4 cases that required advanced skills, over 40% of the students scored below expectations.

Target Achievement: Not Met

Recommendations : Whereas students were able to handle cases using basic Excel skills, they were not experienced with its advanced features. Marsha Scheidt will endeavor to improve their skills using two approaches. First, she will incorporate Excel 2007 cases into her undergraduate BACC 408 course to target these 4 areas. She will develop these cases to improve these 4 identified advanced skills. It is her objective that this knowledge should carry forward to their graduate BACC 536 course. Second, in BACC 536, she will prepare four short demonstrations using these advanced features. Hopefully, these short lectures will be a refresher in how to apply these advanced skills.


Notes :

Substantiating Evidence:

MACC Goal E3: Accounting in Business Organizations

Mapped to: USA- AACSB- Standards: Assurance of learning standards 5.1, Assurance of learning standards 5.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MACC E3: Understanding Accounting in Business Organizations
Course level; Direct - Exam

Details/Description: Students in BACC 536 Accounting Information Systems, Summer term 2009, completed 10 exam questions about the strategic role of accounting and information in business organizations. This questionnaire was based on a reading concerning the strategic goals of an organization. Twenty-five students were evaluated. The metric used was the following: less than 50% correct was below expectations, 50-75% met expectations, and over 75% exceeded expectations.

Target: Above + Meets Expectation is greater than 85%

Implementation Plan (timeline): Summer 2009

Key/Responsible Personnel: Marsha Scheidt

Supporting Attachments:

Findings for MACC E3: Understanding Accounting in Business Organizations

Summary of Findings: Students were successful in differentiating between structured and unstructured decisions, decision scope, and value chain components. On these questions, students consistently scored over 80%. On questions involving strategic differentiation, they scored below 70%.

Target Achievement: Not Met

Recommendations : Students performed poorly on questions involving variety-based, needs-based and access-based strategic positions. Scheidt will prepare a lecture on these topics and present a more in-depth discussion of them. She will also endeavor to link these concepts to more concrete examples.

Notes :

Substantiating Evidence:


Report: Assessment Plan Details for: Business Administration: Management & Marketing

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Business Administration: Undergraduate Outcome Set

Outcomes

BS Goal A1: Written Communication

Mapped to: USA- AACSB- Standards: Assurance of Learning Standards 1.1, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

BS A1: Written Communication
Course level; Direct - Other

Details/Description: The BMGT 310, Business Communication, course requires students to compose several written documents. The College of Business used one of the written assignments for the undergraduate degree program assessment. The students completed the program assessment assignment outside of class. The assignment directed the student to compose a message to employees of a company announcing the return to a formal dress code. Using a rubric, the professors evaluated the following four traits of writing.

1. logic and organization
2. language
3. spelling and grammar
4. purpose

Target: Meets + expectations are equal to or greater than 80%.

Implementation Plan (timeline): Spring 2009

Key/Responsible Personnel: Cindy White

Supporting Attachments:

Findings for BS A1: Written Communication

Summary of Findings:

Overall:

Meets + Above Expectations 90.42%

Above Expectations 45.74%; Meets Expectations 44.68%; Below Expectations 9.58%

Traits:

1) Logic and organization

Meets + Above Expectations: 93.61%

Above Expectations: 42.55%; Meets Expectations: 51.06%; Below Expectations: 6.39%

2) Language

Meets + Above Expectations: 89.36%

Above Expectations: 41.49%; Meets Expectations: 47.87%; Below Expectations: 10.64%

3) Spelling and grammar

Meets + Above Expectations: 87.23%


Above Expectations: 43.62%; Meets Expectations: 43.61%; Below Expectations: 12.77%

4) Purpose

Meets + Above Expectations: 90.43%

Above Expectations: 42.56%; Meets Expectations: 47.87%; Below Expectations: 9.57%

Target Achievement: Exceeded

Recommendations : No changes.

Notes : In Spring 2008, 77.4% of results overall met or exceeded expectations. By comparison, the overall Spring 2009 result of 90.42% is a great improvement.

Substantiating Evidence:

BS Goal A2: Business Presentation Skills

Mapped to: USA- AACSB- Standards: Assurance of Learning Standards 1.1, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

BS A2: Business Presentation Skills
Course level; Direct - Other

Details/Description: BMGT 310 students, working in teams, participated in a role-playing exercise. A problem scenario presented a situation where the students made a decision and presented their decision to a board of directors (the BMGT 310 class). Each student spoke for two to three minutes. This role-playing presentation constituted the business presentation assessment. The professors used a rubric to evaluate the following five presentation traits.

1. organization
2. visual aids
3. eye contact
4. elocution
5. mannerisms

Target: Meets + expectations are equal to or greater than 80%.

Implementation Plan (timeline): Spring 2009

Key/Responsible Personnel: Cindy White

Supporting Attachments:

Findings for BS A2: Business Presentation Skills

Summary of Findings:

Overall:

Meets + Above Expectations 95%

Above Expectations 46%; Meets Expectations 49%; Below Expectations 5%

Traits:

1) Organization

Meets + Above Expectations: 94%

Above Expectations: 58%; Meets Expectations: 36%; Below Expectations: 6%

2) Visual aids

Meets + Above Expectations: 94%

Above Expectations: 53%; Meets Expectations: 41%; Below Expectations: 6%

3) Eye Contact

Meets + Above Expectations: 75%


Above Expectations: 33%; Meets Expectations: 42%; Below Expectations: 25%

4) Elocution

Meets + Above Expectations: 91%

Above Expectations: 35%; Meets Expectations: 56%; Below Expectations: 9%

5) Mannerisms

Meets + Above Expectations: 74%

Above Expectations: 28%; Meets Expectations: 46%; Below Expectations: 26%

Target Achievement: Exceeded

Recommendations : Overall, no changes are recommended. Among the traits, eye contact and mannerisms need improvement.

Notes :

Substantiating Evidence:

BS Goal B: Ability to reason ethically.

Mapped to: USA- AACSB- Standards: Assurance of Learning Standards 1.1, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

BS B Ethics
Course level; Direct - Other

Details/Description: Students in the BACC 335 class are required to read and analyze a case with ethical issues. They are told to complete the following task. “Thoroughly discuss the legal and ethical issues that arise in this scenario. In your discussion, offer potential solutions where appropriate. Explain your answer completely.” The students are evaluated on each of the following areas and given an overall ethics evaluation.

Trait 1: Follows Assignment Instructions

Trait 2: Identifies the issue of whether it was ethical for Jean Claude to offer business opportunities to the shipper and insurer in exchange for donations.

Trait 3: Identifies issues regarding whether it was ethical for the museum to alter its exhibition format to meet the demands of the owner of the pieces or to meet the demands of a donor.

Trait 4: Identifies the issues regarding whether a donation by a company in exchange for a contract to

Findings for BS B Ethics

Summary of Findings:

Overall Ethics Assessment:

Meets + Above Expectations 91.23%

Above Expectations 47.37%; Meets Expectations 43.86%; Below Expectations 8.77%

Traits:

1) Follows Assignment Instructions

Meets + Above Expectations: 94.74%

Above Expectations: 66.67%; Meets Expectations: 28.07%; Below Expectations: 5.26%

2) Identifies the issue of whether it was ethical for Jean Claude to offer business opportunities to the shipper


ship or insure property is really a charitable contribution and whether a "donor" making such a gift can claim a tax deduction.

Trait 5: Identifies whether Jane violated her code of ethics and compromised the position of the museum, and if so, how?

Trait 6: Identifies legal and ethical issues that could limit enforceability of the pledges to the Museum against donors in this case.

Trait 7: Identifies ethical issues related to Jean Claude's suggestions or demands that the museum exhibit his pieces in a certain way for his own personal interest.

Trait 8: Offers viable solutions to ethical problems.

Target: Meets + expectations are equal to or greater than 80%.

Implementation Plan (timeline): Fall 2008

Key/Responsible Personnel: Joanie Sompayrac

Supporting Attachments:

and insurer in exchange for donations.

Meets + Above Expectations: 95.61%

Above Expectations: 79.82%; Meets Expectations: 15.79%; Below Expectations: 4.39%

3) Identifies issues regarding whether it was ethical for the museum to alter its exhibition format to meet the demands of the owner of the pieces or to meet the demands of a donor.

Meets + Above Expectations: 50%

Above Expectations: 21.05%; Meets Expectations: 28.95%; Below Expectations: 50%

4) Identifies the issues regarding whether a donation by a company in exchange for a contract to ship or insure property is really a charitable contribution and whether a "donor" making such a gift can claim a tax deduction.

Meets + Above Expectations: 63.16%

Above Expectations: 35.09%; Meets Expectations: 28.07%; Below Expectations: 36.84%

5) Identifies whether Jane violated her code of ethics and compromised the position of the museum, and if so, how?

Meets + Above Expectations: 89.47%

Above Expectations: 64.91%; Meets Expectations: 24.56%; Below Expectations: 10.53%

6) Identifies legal and ethical issues that could limit enforceability of the pledges to the Museum against donors in this case.

Meets + Above Expectations: 92.98%

Above Expectations: 39.47%; Meets Expectations: 53.51%; Below Expectations: 7.02%

7) Identifies ethical issues related to Jean Claude's suggestions or demands that the museum exhibit his pieces in a certain way for his own personal interest.

Meets + Above Expectations: 55.26%

Above Expectations: 20.18%; Meets Expectations: 35.08%


Below Expectations: 44.74%

8) Offers viable solutions to ethical problems

Meets + Above Expectations: 79.82%

Above Expectations: 28.07%; Meets Expectations: 51.75%; Below Expectations: 20.18%

Target Achievement: Exceeded

Recommendations : Overall, no changes are recommended. Traits 3 and 4 need improvement.

Notes :

Substantiating Evidence:

BS Goal C: Analytical, critical-thinking ability.

Mapped to: USA- AACSB- Standards: Assurance of Learning Standards 1.1, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

BS C: Critical Thinking
Course level; Direct - Other

Details/Description: A major component of BMGT 440 is the Business Strategy Game simulation. Students are asked to take on the role of CEO of a sneaker manufacturing company. They compete against their colleagues. The entire simulation revolves around critical thinking. Students must be able to analyze their own company, understand their competition, and make decisions that will move them forward in an international marketplace. Students are scored on how well they meet their investor’s expectations and on how they fare based on “best-in-industry.” The scoring is a 50%-50% weighting.

Target: Meets + expectations are equal to or greater than 80%.

Implementation Plan (timeline): Fall 2008

Key/Responsible Personnel: Kathleen Wheatley

Supporting Attachments:

Findings for BS C: Critical Thinking

Summary of Findings:

Overall:

Above + Meets Expectations: 81.1%

Above Expectations: 49.3%; Meets Expectations: 31.8%; Below Expectations: 18.9%

Target Achievement: Met

Recommendations : No changes are recommended.

Notes :

Substantiating Evidence:

BS Goal D: Use of Information Technology

Mapped to: USA- AACSB- Standards: Assurance of Learning Standards 1.1, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

BS D: Use of Information Technology
Course level; Direct - Exam

Details/Description: Students gain an information technology proficiency in BMGT 100, Computers in Business. BMGT 100 focuses on four areas of Microsoft Office: Word, Excel, PowerPoint, and Access. In each of these areas, students in BMGT 100 take a test consisting of a number of tasks.

For the College of Business assessment of UG Objective

Findings for BS D: Use of Information Technology

Summary of Findings: The BMGT 100 Computers in Business course was being redesigned as an online course during the spring of 2009. Consequently, no assessment was conducted.

Target Achievement:

Recommendations : We are developing a new assessment for the new course.


D, the coordinator will embed a series of tasks into each test in all sections. The tasks are computerized and the coordinator will have the data from all sections. The number of embedded tasks for each test is below.

Word 6 tasks
Excel 8 tasks
PowerPoint 6 tasks
Access 6 tasks

The rubric gives the student assessment scores for each test, and the following pages describe the assessments for Word, Excel, PowerPoint, and Access.

Target: Meets + expectations are equal to or greater than 80%.

Implementation Plan (timeline): Spring 2009

Key/Responsible Personnel: Luis Leon and Tony Parsley

Supporting Attachments:

Notes :

Substantiating Evidence:

BS Goal E1: Diversity

Mapped to: USA- AACSB- Standards: Assurance of Learning Standards 1.1, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

BS E1: Diversity
Course level; Direct - Exam

Details/Description: A free-form essay was included at the end of the exam. The question for the exam was “When building a department, team, or any other group of employees, managers have become more dedicated to seeking a diverse set of employees because they realize there are distinct advantages to doing so. Please define diversity and explain these advantages.”

Target: Meets + expectations are equal to or greater than 80%.

Implementation Plan (timeline): Fall 2008

Key/Responsible Personnel: Jennifer Stanley

Supporting Attachments:

Findings for BS E1: Diversity

Summary of Findings:

Overall:

Meets + Above Expectations: 71.8%

Above Expectations: 42.4%; Meets Expectations: 29.4%; Below Expectations: 28.2%

Target Achievement: Not Met

Recommendations : More emphasis on diversity needs to be included in the curriculum.

Notes :

Substantiating Evidence:

BS Goal E2: Stages of Group Development

Mapped to: USA- AACSB- Standards: Assurance of Learning Standards 1.1, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

BS E2: Stages of Group Development
Course level; Direct - Exam

Details/Description: Six questions on group/team development were embedded into the final exam.

Findings for BS E2: Stages of Group Development

Summary of Findings:


Target: Meets + expectations are equal to or greater than 80%.

Implementation Plan (timeline): Fall 2008

Key/Responsible Personnel: Jennifer Stanley

Supporting Attachments:

Overall:

Meets + Above Expectations: 90.3%

Above Expectations: 45.4%; Meets Expectations: 44.9%; Below Expectations: 9.7%

Question Results: Q1 - 73.3% correct; Q2 - 80.7% correct; Q3 - 79.0% correct; Q4 - 77.8% correct; Q5 - 80.1% correct; Q6 - 23.9% correct

Target Achievement: Exceeded

Recommendations : No changes are recommended.

Notes :

Substantiating Evidence:

BS Goal E3: Group Collaboration

Mapped to: USA- AACSB- Standards: Assurance of Learning Standards 1.1, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

BS E3: Group Collaboration
Course level; Direct - Other

Details/Description: Students in BMGT 440 participated in teams and competed in a business simulation game. Working in the teams, the students were required to gain an understanding of the game’s industry by conducting an Industry Analysis. They analyzed economic factors, industry trends, industry shocks, and strategic groups. The teams presented the Industry Analysis to the class. The teammates evaluated each of the other team members on the student’s contribution to the Industry Presentation and Industry Analysis.

The College used the teammate evaluation of each team member for the assessment of objective E3, group collaboration. Every member of the team rated each student on five group collaboration traits. The five traits are below.

Attendance
Preparation for meetings
Enthusiasm and commitment
Teamwork and cooperativeness
Carried fair share of work load

Target: Meets + expectations are equal to or greater than 80%.

Implementation Plan (timeline): Spring 2009

Key/Responsible Personnel: Kathleen Wheatley

Supporting Attachments:

Findings for BS E3: Group Collaboration

Summary of Findings:

Overall:

Meets + Above Expectations 99.25%

Above Expectations 91.04%; Meets Expectations 8.21%; Below Expectations 0.75%

Traits:

1) Attendance

Meets + Above Expectations: 99.25%

Above Expectations: 94.78%; Meets Expectations: 4.47%; Below Expectations: 0.75%

2) Preparation for Meetings

Meets + Above Expectations: 99.25%

Above Expectations: 82.09%; Meets Expectations: 17.16%; Below Expectations: 0.75%

3) Enthusiasm and commitment


Meets + Above Expectations: 99.25%

Above Expectations: 85.82%; Meets Expectations: 13.43%; Below Expectations: 0.75%

4) Teamwork and cooperativeness

Meets + Above Expectations: 100%

Above Expectations: 91.79%; Meets Expectations: 8.21%; Below Expectations: 0%

5) Carried fair share of work load

Meets + Above Expectations: 99.25%

Above Expectations: 56.71%; Meets Expectations: 42.54%; Below Expectations: 0.75%

Target Achievement: Exceeded

Recommendations : No changes are recommended.

Notes : Overall, students improved over their Spring 2008 scores.

Substantiating Evidence:

BS Goal F1: Accounting

Mapped to: USA- AACSB- Standards: Assurance of Learning Standards 1.1, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

BS F1: Accounting
Course level; Direct - Exam

Details/Description: The Educational Testing Service’s (ETS) Major Field Test (MFT) in Business assessed all nine objectives of UG Goal F. The test consisted of 120 multiple-choice questions.

The College utilized the MFT in Business to assess the objectives of UG Goal F. The test was a permanent part of all BMGT 440 classes with a weight of ten percent of the student’s grade. The scores, compared to national test scores, ranked COB students.

Target: Students rank at the 50th percentile or above on the ETS exam.

Implementation Plan (timeline): Fall 2008, Spring 2009, and Summer 2009

Key/Responsible Personnel: Kathleen Wheatley, Charles Ragland, and Bob Koplowski

Findings for BS F1: Accounting

Summary of Findings: 65th percentile for Fall 2008; 65th percentile for Summer 2009; 40th percentile for Spring 2009.

Target Achievement: Met

Recommendations : We need to improve the consistency of student results.

Notes :

Substantiating Evidence:


Supporting Attachments:

BS Goal F2: Economics

Mapped to: USA- AACSB- Standards: Assurance of Learning Standards 1.1, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

BS F2: Economics
Course level; Direct - Exam

Details/Description: The Educational Testing Service’s (ETS) Major Field Test (MFT) in Business assessed all nine objectives of UG Goal F. The test consisted of 120 multiple-choice questions.

The College utilized the MFT in Business to assess the objectives of UG Goal F. The test was a permanent part of all BMGT 440 classes with a weight of ten percent of the student’s grade. The scores, compared to national test scores, ranked COB students.

Target: Students rank at the 50th percentile or above on the ETS exam.

Implementation Plan (timeline): Fall 2008, Spring 2009, and Summer 2009

Key/Responsible Personnel: Kathleen Wheatley, Charles Ragland, and Bob Koplowski

Supporting Attachments:

Findings for BS F2: Economics

Summary of Findings: 50th percentile for Fall 2008; 55th percentile for Spring 2009; 20th percentile for Summer 2009.

Target Achievement: Met

Recommendations : We need to improve the consistency of student results.

Notes :

Substantiating Evidence:

BS Goal F3: Management

Mapped to: USA- AACSB- Standards: Assurance of Learning Standards 1.1, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

BS F3: Management
Course level; Direct - Exam

Details/Description: The Educational Testing Service’s (ETS) Major Field Test (MFT) in Business assessed all nine objectives of UG Goal F. The test consisted of 120 multiple-choice questions.

The College utilized the MFT in Business to assess the objectives of UG Goal F. The test was a permanent part of all BMGT 440 classes with a weight of ten percent of the student’s grade. The scores, compared to national test scores, ranked COB students.

Target: Students rank at the 50th percentile or above on the ETS exam.

Implementation Plan (timeline): Fall 2008, Spring 2009, and Summer 2009

Key/Responsible Personnel: Kathleen Wheatley, Charles Ragland, and Bob Koplowski

Supporting Attachments:

Findings for BS F3: Management

Summary of Findings: 60th percentile for Fall 2008; 60th percentile for Spring 2009; 75th percentile for Summer 2009.

Target Achievement: Exceeded

Recommendations : No changes are recommended.

Notes :

Substantiating Evidence:


BS Goal F4: Quantitative business analysis

Mapped to: USA- AACSB- Standards: Assurance of Learning Standards 1.1, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

BS F4: Quantitative Business Analysis
Course level; Direct - Exam

Details/Description: The Educational Testing Service’s (ETS) Major Field Test (MFT) in Business assessed all nine objectives of UG Goal F. The test consisted of 120 multiple-choice questions.

The College utilized the MFT in Business to assess the objectives of UG Goal F. The test was a permanent part of all BMGT 440 classes with a weight of ten percent of the student’s grade. The scores, compared to national test scores, ranked COB students.

Target: Students rank at the 50th percentile or above on the ETS exam.

Implementation Plan (timeline): Fall 2008, Spring 2009, and Summer 2009

Key/Responsible Personnel: Kathleen Wheatley, Charles Ragland, and Bob Koplowski

Supporting Attachments:

Findings for BS F4: Quantitative Business Analysis

Summary of Findings: 45th percentile for Fall 2008; 45th percentile for Spring 2009; 60th percentile for Summer 2009.

Target Achievement: Met

Recommendations : We need to improve the consistency of student results.

Notes :

Substantiating Evidence:

BS Goal F5: Information systems

Mapped to: USA- AACSB- Standards: Assurance of Learning Standards 1.1, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

BS F5: Information Systems
Course level; Direct - Exam

Details/Description: The Educational Testing Service’s (ETS) Major Field Test (MFT) in Business assessed all nine objectives of UG Goal F. The test consisted of 120 multiple-choice questions.

The College utilized the MFT in Business to assess the objectives of UG Goal F. The test was a permanent part of all BMGT 440 classes with a weight of ten percent of the student’s grade. The scores, compared to national test scores, ranked COB students.

Target: Students rank at the 50th percentile or above on the ETS exam.

Implementation Plan (timeline): Fall 2008, Spring 2009, and Summer 2009

Key/Responsible Personnel: Kathleen Wheatley, Charles Ragland, and Bob Koplowski

Supporting Attachments:

Findings for BS F5: Information Systems

Summary of Findings: 50th percentile for Fall 2008; 50th percentile for Spring 2009; 15th percentile for Summer 2009.

Target Achievement: Met

Recommendations : We need to improve the consistency of student results.

Notes :

Substantiating Evidence:

BS Goal F6: Finance

Mapped to: USA- AACSB- Standards: Assurance of Learning Standards 1.1, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

BS F6: Finance
Course level; Direct - Exam

Findings for BS F6: Finance


Details/Description: The Educational Testing Service’s (ETS) Major Field Test (MFT) in Business assessed all nine objectives of UG Goal F. The test consisted of 120 multiple-choice questions.

The College utilized the MFT in Business to assess the objectives of UG Goal F. The test was a permanent part of all BMGT 440 classes with a weight of ten percent of the student’s grade. The scores, compared to national test scores, ranked COB students.

Target: Students rank at the 50th percentile or above on the ETS exam.

Implementation Plan (timeline): Fall 2008, Spring 2009, and Summer 2009

Key/Responsible Personnel: Kathleen Wheatley, Charles Ragland, and Bob Koplowski

Supporting Attachments:

Summary of Findings: 60th percentile for Fall 2008; 70th percentile for Spring 2009; 70th percentile for Summer 2009.

Target Achievement: Exceeded

Recommendations : No changes are recommended.

Notes :

Substantiating Evidence:

BS Goal F7: Marketing

Mapped to: USA- AACSB- Standards: Assurance of Learning Standards 1.1, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

BS F7: Marketing
Course level; Direct - Exam

Details/Description: The Educational Testing Service’s (ETS) Major Field Test (MFT) in Business assessed all nine objectives of UG Goal F. The test consisted of 120 multiple-choice questions.

The College utilized the MFT in Business to assess the objectives of UG Goal F. The test was a permanent part of all BMGT 440 classes with a weight of ten percent of the student’s grade. The scores, compared to national test scores, ranked COB students.

Target: Students rank at the 50th percentile or above on the ETS exam.

Implementation Plan (timeline): Fall 2008, Spring 2009, and Summer 2009

Key/Responsible Personnel: Kathleen Wheatley, Charles Ragland, and Bob Koplowski

Supporting Attachments:

Findings for BS F7: Marketing

Summary of Findings: 40th percentile for Fall 2008; 60th percentile for Summer 2009; 40th percentile for Spring 2009.

Target Achievement: Met

Recommendations : We need to improve the consistency of student results.

Notes :

Substantiating Evidence:

BS Goal F8: Legal issues

Mapped to: USA- AACSB- Standards: Assurance of Learning Standards 1.1, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

BS F8: Legal Issues
Course level; Direct - Exam

Details/Description: The Educational Testing Service’s (ETS) Major Field Test (MFT) in Business assessed all nine objectives of UG Goal F. The test consisted of 120 multiple-choice questions.

Findings for BS F8: Legal Issues

Summary of Findings: 70th percentile for Fall 2008; 80th percentile for Summer 2009; 40th percentile for Spring 2009.


The College utilized the MFT in Business to assess the objectives of UG Goal F. The test was a permanent part of all BMGT 440 classes with a weight of ten percent of the student’s grade. The scores, compared to national test scores, ranked COB students.

Target: Students rank at the 50th percentile or above on the ETS exam.

Implementation Plan (timeline): Fall 2008, Spring 2009, and Summer 2009

Key/Responsible Personnel: Kathleen Wheatley, Charles Ragland, and Bob Koplowski

Supporting Attachments:

Target Achievement: Met

Recommendations : We need to improve the consistency of student results.

Notes :

Substantiating Evidence:

BS Goal F9: International issues

Mapped to: USA- AACSB- Standards: Assurance of Learning Standards 1.1, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

BS F9: International Issues
Course level; Direct - Exam

Details/Description: The Educational Testing Service’s (ETS) Major Field Test (MFT) in Business assessed all nine objectives of UG Goal F. The test consisted of 120 multiple-choice questions.

The College utilized the MFT in Business to assess the objectives of UG Goal F. The test was a permanent part of all BMGT 440 classes with a weight of ten percent of the student’s grade. The scores, compared to national test scores, ranked COB students.

Target: Students rank at the 50th percentile or above on the ETS exam.

Implementation Plan (timeline): Fall 2008, Spring 2009, and Summer 2009

Key/Responsible Personnel: Kathleen Wheatley, Charles Ragland, and Bob Koplowski

Supporting Attachments:

Findings for BS F9: International Issues

Summary of Findings: 50th percentile for Fall 2008; 55th percentile for Summer 2009; 45th percentile for Spring 2009.

Target Achievement: Met

Recommendations : We need to improve the consistency of student results.

Notes :

Substantiating Evidence:


Report: Assessment Plan Details for: Business Administration: MBA

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Business Administration: MBA Outcome Set

Outcomes

MBA Goal A1: Group Collaboration

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MBA A1: Group Collaboration
Course level; Direct - Other

Details/Description: Students in BMGT 584, Management Skills, work in teams to develop their management skills. For the final exam, each team completes a semester project in which they develop their own realistic management scenario and then videotape themselves applying the management skills learned in BMGT 584.

The team members also rate each other’s contribution to the team effort in terms of attendance, participation, effort, work quality, and interpersonal behavior. The College of Business will use this evaluation for program assessment of MBA Objective A1, group collaboration. The evaluation form and rubric for the COB assessment follow.

Target: Meets + above expectations are equal to or greater than 85%.

Implementation Plan (timeline): Spring 2009

Key/Responsible Personnel: Rich Allen

Supporting Attachments:

Findings for MBA A1: Group Collaboration

Summary of Findings:

Overall:

Meets + Expectations: 100%

Above Expectations: 97.67%; Meets Expectations: 2.33%; Below Expectations: 0%

Traits:

1) Attendance

Meets + Above Expectations: 100%

Above Expectations: 100%; Meets Expectations: 0%; Below Expectations: 0%

2) Participation

Meets + Above Expectations: 100%

Above Expectations: 97.67%; Meets Expectations: 2.33%; Below Expectations: 0%

3) Effort

Meets + Above Expectations: 100%


Above Expectations: 97.67%; Meets Expectations: 2.33%; Below Expectations: 0%

4) Work Quality

Meets + Above Expectations: 100%

Above Expectations: 97.67%; Meets Expectations: 2.33%; Below Expectations: 0%

5) Interpersonal behavior

Meets + Above Expectations: 100%

Above Expectations: 95.35%; Meets Expectations: 4.65%; Below Expectations: 0%

Target Achievement: Exceeded

Recommendations : No changes are recommended.

Notes :

Substantiating Evidence:

MBA Goal A2: Leadership

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MBA A2: Leadership
Course level; Direct - Other

Details/Description: Students work in small groups (usually three) on role-playing in order to develop experience in critical management skills. Two of the students in the group play the roles, while the third student evaluates the appropriate behavior. Each student receives feedback on how she or he applied at least one of the four skills listed below during a role-play skill practice to a realistic management scenario. The description and rating guidelines for each of the four skills follow.

Acting assertively - Sometimes individuals attempt to increase their power over other individuals by exercising inappropriate influence. This role playing exercise places the student in the role of the person needing to resist the influence. The students plan their resistance strategy and are assessed on nine areas. The effectiveness of the resistance is evaluated by the third member of the group.

Communicating supportively - Effective one-on-one coaching and counseling are important skills for leaders. This exercise places the student in a situation where they must give advice to a subordinate in order to improve the subordinate’s performance. The student

Findings for MBA A2: Leadership

Summary of Findings:

ACTING ASSERTIVELY OVERALL:

Meets + Expectations: 100%

Above Expectations: 68%; Meets Expectations: 32%; Below Expectations: 0%

Traits:

1) Explain the adverse effects of compliance on performance.

Meets + Above Expectations: 88%

Above Expectations: 40%; Meets Expectations: 48%; Below Expectations: 12%

2) Defend your personal rights.

Meets + Above Expectations: 92%

Above Expectations: 72%


counseling another role-player is assessed by the third member of the group on ten behaviors for effective supportive communication.

Managing conflict – An essential leadership behavior is the ability to manage conflict. In this managing conflict role-playing exercise, the responder and the initiator are assessed by the observer resulting in two separate assessments. The observer assesses the responder on six guidelines and assesses the initiator on seven guidelines.

Motivating others – The ability to motivate people is a vital leadership skill for managers. This role-playing exercise requires a manager to motivate an employee to change an inappropriate behavior into an appropriate behavior. The observer assesses the manager on the eight guidelines.

Target: Meets + above expectations are equal to or greater than 85%.

Implementation Plan (timeline): Fall 2008

Key/Responsible Personnel: Rich Allen

Supporting Attachments:

Meets Expectations: 20%; Below Expectations: 8%

3) Firmly refuse to comply.

Meets + Above Expectations: 96%

Above Expectations: 76%; Meets Expectations: 20%; Below Expectations: 4%

4) Examine the intent of any gift or favor.

Meets + Above Expectations: 76%

Above Expectations: 52%; Meets Expectations: 24%; Below Expectations: 24%

5) Confront the manipulator.

Meets + Above Expectations: 96%

Above Expectations: 68%; Meets Expectations: 28%; Below Expectations: 4%

6) Refuse to bargain.

Meets + Above Expectations: 96%

Above Expectations: 56%; Meets Expectations: 40%; Below Expectations: 4%

7) Use countervailing power to shift dependence to interdependence.

Meets + Above Expectations: 84%

Above Expectations: 44%; Meets Expectations: 40%; Below Expectations: 16%

8) Confront the exploiting individual directly.

Meets + Above Expectations: 88%

Above Expectations: 72%; Meets Expectations: 16%; Below Expectations: 12%

9) Explain the adverse effects of compliance on performance.

Meets + Above Expectations: 96%

Above Expectations: 76%; Meets Expectations: 20%; Below Expectations: 4%


COMMUNICATING SUPPORTIVELY OVERALL:

Meets + Expectations: 100%

Above Expectations: 52.9%; Meets Expectations: 47.1%; Below Expectations: 0%

Traits:

1) Differentiate between coaching and counseling.

Meets + Above Expectations: 94.1%

Above Expectations: 58.8%; Meets Expectations: 35.3%; Below Expectations: 5.9%

2) Communicate congruently.

Meets + Above Expectations: 100%

Above Expectations: 76.5%; Meets Expectations: 23.5%; Below Expectations: 0%

3) Use descriptive, not evaluative statements.

Meets + Above Expectations: 88.2%

Above Expectations: 64.7%; Meets Expectations: 23.5%; Below Expectations: 11.8%

4) Use problem-oriented, not person-oriented statements.

Meets + Above Expectations: 94.2%

Above Expectations: 47.1%; Meets Expectations: 47.1%; Below Expectations: 5.8%

5) Use validating statements.

Meets + Above Expectations: 88.2%

Above Expectations: 64.7%; Meets Expectations: 23.5%; Below Expectations: 11.8%

6) Use specific, not global statements.

Meets + Above Expectations: 82.3%

Above Expectations: 52.9%; Meets Expectations: 29.4%; Below Expectations: 17.7%


7) Use conjunctive, not disjunctive statements.

Meets + Above Expectations: 100%

Above Expectations: 58.8%
Meets Expectations: 41.2%
Below Expectations: 0%

8) Own your statements.

Meets + Above Expectations: 82.3%

Above Expectations: 58.8%
Meets Expectations: 23.5%
Below Expectations: 17.7%

9) Use supportive listening and appropriate responses to insure two-way conversation.

Meets + Above Expectations: 94.1%

Above Expectations: 64.7%
Meets Expectations: 29.4%
Below Expectations: 5.9%

10) Implement Personal Management Interviews.

Meets + Above Expectations: 82.3%

Above Expectations: 64.7%
Meets Expectations: 17.6%
Below Expectations: 17.7%

MANAGING CONFLICT (RESPONDER) OVERALL:

Meets + Expectations: 85.7%

Above Expectations: 64.3%
Meets Expectations: 21.4%
Below Expectations: 14.3%

Traits:

1) Establish a climate for joint problem solving.

Meets + Above Expectations: 100%

Above Expectations: 78.6%
Meets Expectations: 21.4%
Below Expectations: 0%

2) Seek additional information.

Meets + Above Expectations: 78.6%

Above Expectations: 57.2%
Meets Expectations: 21.4%


Below Expectations: 21.4%

3) Agree with some aspect of the complaint.

Meets + Above Expectations: 92.9%

Above Expectations: 64.3%
Meets Expectations: 28.6%
Below Expectations: 7.1%

4) Ask for suggestions of acceptable alternatives.

Meets + Above Expectations: 85.7%

Above Expectations: 64.3%
Meets Expectations: 21.4%
Below Expectations: 14.3%

5) Create an action plan.

Meets + Above Expectations: 100%

Above Expectations: 57.1%
Meets Expectations: 42.9%
Below Expectations: 0%

6) Schedule follow-up.

Meets + Above Expectations: 85.7%
Above Expectations: 71.4%
Meets Expectations: 14.3%
Below Expectations: 14.3%

Target Achievement: Exceeded

Recommendations : No changes are recommended.

Notes :

Substantiating Evidence:

MBA Goal B1: Written Communication

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MBA B1: Written Communication
Course level; Direct - Other

Details/Description: The assessment for the writing objective was an in-class assessment. The students responded in writing to the following questions. “What was the toughest ethical decision you have faced? How did you handle it and why? What did you learn?” Using the three-point scale, the professor assessed the students and gave an overall writing score and scores on the following four writing traits.

Writing Traits
Ideas and content
Organization

Findings for MBA B1: Written Communication

Summary of Findings:

Overall:

Meets + Expectations: 100%

Above Expectations: 93.75%
Meets Expectations: 6.25%
Below Expectations: 0%

Traits:


Language and voice
Presentation

Target: Meets + above expectations are equal to or greater than 85%.

Implementation Plan (timeline): Spring 2009

Key/Responsible Personnel: Bev Brockman

Supporting Attachments:

1) Ideas and content

Meets + Above Expectations: 100%

Above Expectations: 81.25%
Meets Expectations: 18.75%
Below Expectations: 0%

2) Organization

Meets + Above Expectations: 100%

Above Expectations: 81.25%
Meets Expectations: 18.75%
Below Expectations: 0%

3) Language and voice

Meets + Above Expectations: 93.75%

Above Expectations: 25%
Meets Expectations: 68.75%
Below Expectations: 6.25%

Target Achievement: Exceeded

Recommendations : No changes are recommended.

Notes : The language (grammar) and voice (tense) trait had 28 percent of the students scoring below expectations. To evaluate these traits better, the course professor recommended separating the two areas into different traits. The AOL Committee agreed and approved the action.

Substantiating Evidence:

MBA Goal B2: Business Presentation Skills

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MBA B2: Business Presentation Skills
Course level; Direct - Other

Details/Description: As part of the BMKT 586 courses, student teams completed a marketing audit of a company. The marketing audit reviewed the current marketing practices of a company. The audit was a written document, but the team also presented the results of the audit to the class.

MBA program assessment of each student’s business presentation skills used the marketing audit presentation. Each student was required to participate in the presentation and spoke for a minimum of four minutes. The students received an overall assessment of their skills and an assessment on each of the following five traits.

Organization
Visual aids

Findings for MBA B2: Business Presentation Skills

Summary of Findings:

Overall:

Meets + Expectations: 94.34%

Above Expectations: 20.75%
Meets Expectations: 73.59%
Below Expectations: 5.66%

Traits:

1) Organization

Meets + Above Expectations: 92.45%

Above Expectations: 20.75%


Eye contact
Elocution
Mannerisms

Target: Meets + above expectations are equal to or greater than 85%.

Implementation Plan (timeline): Spring 2009

Key/Responsible Personnel: Jim Henley

Supporting Attachments:

Meets Expectations: 71.7%
Below Expectations: 7.55%

2) Visual aids

Meets + Above Expectations: 77.36%

Above Expectations: 9.44%
Meets Expectations: 67.92%
Below Expectations: 22.64%

3) Eye contact

Meets + Above Expectations: 71.7%

Above Expectations: 20.75%
Meets Expectations: 50.95%
Below Expectations: 28.3%

4) Elocution

Meets + Above Expectations: 98.11%

Above Expectations: 30.19%
Meets Expectations: 67.92%
Below Expectations: 1.89%

5) Mannerisms

Meets + Above Expectations: 75.47%

Above Expectations: 16.98%
Meets Expectations: 58.49%
Below Expectations: 24.53%

Target Achievement: Met

Recommendations : Overall, no changes are recommended. Among the traits, the visual aids, eye contact, and mannerisms traits need improvement.

Notes :

Substantiating Evidence:

MBA Goal C: Integration

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MBA C: Integration
Course level; Direct - Exam

Details/Description: The College assessed MBA Objectives C, F1, F3, F6, and F7, with the Educational Testing Service’s (ETS) MBA Test. The ETS MBA Test gave a score for each area. The test consisted of 124 multiple-choice questions with half of the questions based on short cases. The scores, compared to national MBA Test scores, ranked COB students.

Findings for MBA C: Integration

Summary of Findings:
90th percentile in Fall 2008.
85th percentile in Spring 2009.
80th percentile in Summer 2009.

Target Achievement: Exceeded

Recommendations : No changes are recommended.


Target: Students rank at the 50th percentile or above on the ETS exam.

Implementation Plan (timeline): Fall 2008, Spring 2009, and Summer 2009

Key/Responsible Personnel: Kathleen Wheatley

Supporting Attachments:

Notes :

Substantiating Evidence:

MBA Goal D: Critical Thinking

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MBA D: Critical Thinking
Course level; Direct - Other

Details/Description: A large part of BUSA 587 is dedicated to the Marketplace Simulation. The students will make quarterly decisions to bring their company from startup to full stage production. Each quarter, teams need to purchase market research, analyze the information, and make a set of decisions to move the company forward. The Marketplace Simulation has a built-in Assurance of Learning Assessment (AOLA) that students take individually. The AOLA looks at individual functional areas as well as analysis skills and critical thinking. For Objective C students will be assessed on their total score of AOLA Sections II through VI.

Target: Meets + above expectations are equal to or greater than 85%.

Implementation Plan (timeline): Spring 2009

Key/Responsible Personnel: Kathleen Wheatley

Supporting Attachments:

Findings for MBA D: Critical Thinking

Summary of Findings:

Meets + Expectations: 100%

Above Expectations: 7%
Meets Expectations: 93%
Below Expectations: 0%

Target Achievement: Exceeded

Recommendations : No changes are recommended.

Notes :

Substantiating Evidence:

MBA Goal E: Ethics

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MBA E: Ethics
Course level; Direct - Other

Details/Description: Students will read and compare the two court decisions in United States v. Causey et al., 2005 U.S. Dist. Lexis 39619 and United States v. Martha Stewart, 305 F. Supp. 2d 368 (SDNY 2004). The students were rated on five traits and the overall objective.

Target: Meets + above expectations are equal to or greater than 85%.

Implementation Plan (timeline): Fall 2008

Key/Responsible Personnel: Brian Finley

Supporting Attachments:

Findings for MBA E: Ethics

Summary of Findings:

Overall:

Meets + Expectations: 95%

Above Expectations: 15%
Meets Expectations: 80%
Below Expectations: 5%

Traits:

1) Identification of ethical issues.

Meets + Above Expectations: 90%

Above Expectations: 35%


Meets Expectations: 55%
Below Expectations: 10%

2) Identification of legal issues.

Meets + Above Expectations: 100%

Above Expectations: 20%
Meets Expectations: 80%
Below Expectations: 0%

3) Identification of alternative courses of action.

Meets + Above Expectations: 85%

Above Expectations: 10%
Meets Expectations: 75%
Below Expectations: 15%

4) Consideration of stakeholders and analysis of alternative actions.

Meets + Above Expectations: 75%

Above Expectations: 5%
Meets Expectations: 70%
Below Expectations: 25%

5) Consideration of the impact of unethical behavior on an organization.

Meets + Above Expectations: 85%

Above Expectations: 0%
Meets Expectations: 85%
Below Expectations: 15%

Target Achievement: Met

Recommendations : Overall, no changes are recommended. Trait 4 needs improvement.

Notes :

Substantiating Evidence:

MBA Goal F1: Accounting

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MBA F1: Accounting
Course level; Direct - Exam

Details/Description: The College assessed MBA Objectives C, F1, F3, F6, and F7, with the Educational Testing Service’s (ETS) MBA Test. The ETS MBA Test gave a score for each area. The test consisted of 124 multiple-choice questions with half of the questions based on short cases. The scores, compared to national MBA Test scores, ranked COB students.

Findings for MBA F1: Accounting

Summary of Findings:

80th percentile in Fall 2008.
85th percentile in Spring 2009.
85th percentile in Summer 2009.

Target Achievement: Exceeded


Target: Students rank at the 50th percentile or above on the ETS exam.

Implementation Plan (timeline): Fall 2008, Spring 2009, and Summer 2009

Key/Responsible Personnel: Kathleen Wheatley

Supporting Attachments:

Recommendations : No changes are recommended.

Notes :

Substantiating Evidence:

MBA Goal F2: Economics

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MBA F2: Economics
Course level; Direct - Exam

Details/Description: Economics is a prerequisite for the above finance and marketing courses. Consequently, the assessment of economics knowledge for MBA students takes place in these classes. The assessment consists of 22 questions covering economics.

Target: Meets + above expectations are equal to or greater than 85%.

Implementation Plan (timeline): Fall 2008

Key/Responsible Personnel: Mike Long

Supporting Attachments:

Findings for MBA F2: Economics

Summary of Findings:
Meets + Above Expectations: 64%

Above Expectations: 44.3%
Meets Expectations: 19.7%
Below Expectations: 36%

Questions:
Q1 - 88.5% correct
Q2 - 93.4% correct
Q3 - 85.2% correct
Q4 - 75.4% correct
Q5 - 78.7% correct
Q6 - 88.5% correct
Q7 - 96.7% correct
Q8 - 95.1% correct
Q9 - 73.8% correct
Q10 - 88.5% correct
Q11 - 59.0% correct
Q12 - 75.4% correct
Q13 - 83.6% correct
Q14 - 88.5% correct
Q15 - 67.2% correct
Q16 - 77.0% correct
Q17 - 65.6% correct
Q18 - 88.5% correct
Q19 - 85.2% correct
Q20 - 80.3% correct
Q21 - 95.1% correct
Q22 - 63.9% correct

Target Achievement: Not Met

Recommendations : Economics scores need to be monitored and improved.

Notes :

Substantiating Evidence:

MBA Goal F3: Management

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings


MBA F3: Management
Course level; Direct - Exam

Details/Description: The College assessed MBA Objectives C, F1, F3, F6, and F7, with the Educational Testing Service’s (ETS) MBA Test. The ETS MBA Test gave a score for each area. The test consisted of 124 multiple-choice questions with half of the questions based on short cases. The scores, compared to national MBA Test scores, ranked COB students.

Target: Students rank at the 50th percentile or above on the ETS exam.

Implementation Plan (timeline): Fall 2008, Spring 2009, and Summer 2009

Key/Responsible Personnel: Kathleen Wheatley

Supporting Attachments:

Findings for MBA F3: Management

Summary of Findings:

85th percentile for Fall 2008.
80th percentile for Spring 2009.
90th percentile for Summer 2009.

Target Achievement: Exceeded

Recommendations : No changes are recommended.

Notes :

Substantiating Evidence:

MBA Goal F4: Business Statistics

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MBA F4: Business Statistics
Course level; Direct - Exam

Details/Description: Four major areas of statistics are covered in the course. They are:

1. Descriptive statistics
2. Probability
3. Statistical inference
4. Regression and correlation

Students took course-embedded, specially-designed weekly assignments that thoroughly covered the above topics and measured the mastery of the topics.

Target: Meets + above expectations are equal to or greater than 85%.

Implementation Plan (timeline): Fall 2008

Key/Responsible Personnel: Mo Ahmadi

Supporting Attachments:

Findings for MBA F4: Business Statistics

Summary of Findings:
Overall:

Meets + Expectations: 72.7%

Above Expectations: 45.4%
Meets Expectations: 27.3%
Below Expectations: 27.3%

Traits:

1) Descriptive statistics.

Meets + Above Expectations: 63.6%

Above Expectations: 45.4%
Meets Expectations: 18.2%
Below Expectations: 36.4%

2) Probability.

Meets + Above Expectations: 81.8%

Above Expectations: 54.5%
Meets Expectations: 27.3%
Below Expectations: 18.2%

3) Statistical inference.

Meets + Above Expectations: 72.7%

Above Expectations: 45.4%
Meets Expectations: 27.3%


Below Expectations: 27.3%

4) Regression and correlation.

Meets + Above Expectations: 81.8%
Above Expectations: 45.4%
Meets Expectations: 36.4%
Below Expectations: 18.2%

Target Achievement: Not Met

Recommendations : Statistics scores need to be monitored and improved.

Notes :

Substantiating Evidence:

MBA Goal F4: Management Science

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MBA F4: Management Science
Course level; Direct - Exam

Details/Description: Dr. Parthasarati Dileepan will administer the management science assessment in BMGT 583, Program and Operations Management, during the fall 2008 semester. The assessment covers six areas of management science:

Project management
Monte Carlo simulation
Statistical process control
Acceptance sampling
Forecasting
Inventory control

Each student in BMGT 583 will answer at least one assessment question on each of the six areas of management science. The questions will be embedded into the course’s regular tests. The following scales will be used to grade the students’ assessments.

Test Performance on the Exam Questions
Less than 80%: Below Expectations (1 pt.)
80% to 90%: Meets Expectations (2 pts.)
Above 90%: Above Expectations (3 pts.)

Trait Assessment Score
11 or less: Below Expectations (1)
12 - 15: Meets Expectations (2)
16 – 18: Above Expectations (3)
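The arithmetic implied by these two scales can be illustrated with a short sketch. The following is a hypothetical reconstruction only, assuming that each area's percent correct is converted to 1-3 points using the exam-question scale and that the six area points are then summed to a 6-18 total graded with the trait scale; it is not the department's actual scoring code, and the student figures shown are invented for illustration.

# Minimal illustrative sketch (hypothetical, not the department's procedure):
# percent correct per area -> 1-3 points; six area points summed -> overall rating.

def area_points(percent_correct):
    """Map percent correct on an area's embedded questions to 1-3 points."""
    if percent_correct > 90:
        return 3   # Above Expectations
    if percent_correct >= 80:
        return 2   # Meets Expectations
    return 1       # Below Expectations

def overall_rating(area_percents):
    """Sum the six area points (6-18 total) and map the total to a rating."""
    total = sum(area_points(p) for p in area_percents.values())
    if total >= 16:
        return "Above Expectations"
    if total >= 12:
        return "Meets Expectations"
    return "Below Expectations"

# Hypothetical student: percent correct in each of the six areas.
student = {
    "Project management": 85, "Monte Carlo simulation": 95,
    "Statistical process control": 70, "Acceptance sampling": 92,
    "Forecasting": 88, "Inventory control": 75,
}
print(overall_rating(student))  # 2+3+1+3+2+1 = 12 -> "Meets Expectations"

Under this reading, a student must average roughly two points per area (meeting expectations in most areas) to reach the 12-point threshold for an overall "Meets Expectations" rating.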

Target: Meets + above expectations are equal to or greater than 85%.

Implementation Plan (timeline): Fall 2008

Key/Responsible Personnel: Dr. Parthasarati Dileepan

Supporting Attachments:

Findings for MBA F4: Management Science

Summary of Findings:

Overall:

Meets + Expectations: 61.4%

Above Expectations: 25%
Meets Expectations: 36.4%
Below Expectations: 38.6%

Traits:

1) Project management.

Meets + Above Expectations: 52.3%

Above Expectations: 34.1%
Meets Expectations: 18.2%
Below Expectations: 47.7%

2) Monte Carlo simulation.

Meets + Above Expectations: 84.1%

Above Expectations: 77.3%
Meets Expectations: 6.8%
Below Expectations: 15.9%

3) Statistical process control.

Meets + Above Expectations: 65.9%

Above Expectations: 47.7%
Meets Expectations: 18.2%
Below Expectations: 34.1%


4) Acceptance sampling.

Meets + Above Expectations: 47.7%

Above Expectations: 40.9%
Meets Expectations: 6.8%
Below Expectations: 52.3%

5) Forecasting.

Meets + Above Expectations: 59.1%

Above Expectations: 52.3%
Meets Expectations: 6.8%
Below Expectations: 40.9%

6) Inventory control.

Meets + Above Expectations: 50%

Above Expectations: 45.5%
Meets Expectations: 4.5%
Below Expectations: 50%

Target Achievement: Not Met

Recommendations : Management science scores need to be monitored and improved.

Notes :

Substantiating Evidence:

MBA Goal F5: Information Systems

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MBA F5: Information Systems
Course level; Direct - Exam

Details/Description: For the MBA program assessment of MIS, Beni Asllani embedded information systems measures into the midterm and final exams of BMGT 581, Management of Information Systems. The measures covered three learning outcomes and involved three questions, one problem, and one essay. An overview of the learning outcomes, embedded measures, and scoring scale is included in the following tab.

Target: Meets + above expectations are equal to or greater than 85%.

Implementation Plan (timeline): Fall 2008

Key/Responsible Personnel: Beni Asllani

Supporting Attachments:

Findings for MBA F5: Information Systems

Summary of Findings:

Overall:

Meets + Expectations: 95.1%

Above Expectations: 80.5%
Meets Expectations: 14.6%
Below Expectations: 4.9%

Traits:

1) Describe five major components of information systems.

Meets + Above Expectations: 100%

Above Expectations: 75.6%
Meets Expectations: 24.4%
Below Expectations: 0%


2) Develop a simple information system application which accurately represents business requirements.

Meets + Above Expectations: 90.2%

Above Expectations: 90.2%
Meets Expectations: 0%
Below Expectations: 9.8%

3) Understand how information technology is used for business process redesign and for competitive advantage.

Meets + Above Expectations: 92.7%

Above Expectations: 56.1%
Meets Expectations: 36.6%
Below Expectations: 7.3%

Target Achievement: Exceeded

Recommendations : No changes are recommended.

Notes :

Substantiating Evidence:

MBA Goal F6: Finance

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MBA F6: Finance
Course level; Direct - Exam

Details/Description: The College assessed MBA Objectives C, F1, F3, F6, and F7, with the Educational Testing Service’s (ETS) MBA Test. The ETS MBA Test gave a score for each area. The test consisted of 124 multiple-choice questions with half of the questions based on short cases. The scores, compared to national MBA Test scores, ranked COB students.

Target: Students rank at the 50th percentile or above on the ETS exam.

Implementation Plan (timeline): Fall 2008

Key/Responsible Personnel: Kathleen Wheatley

Supporting Attachments:

Findings for MBA F6: Finance

Summary of Findings:
90th percentile for Fall 2008.
80th percentile for Spring 2009.
85th percentile for Summer 2009.

Target Achievement: Exceeded

Recommendations : No changes are recommended.

Notes :

Substantiating Evidence:

MBA Goal F7: Marketing

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MBA F7: Marketing
Course level; Direct - Exam

Details/Description: The College assessed MBA Objectives C, F1, F3, F6, and F7, with the Educational Testing Service’s (ETS) MBA Test. The ETS MBA Test gave a score for each area. The test consisted of 124 multiple-choice questions with half of the questions based on short cases. The scores, compared to national MBA Test scores, ranked COB students.

Findings for MBA F7: Marketing

Summary of Findings:

85th percentile for Fall 2008.
80th percentile for Spring 2009.
85th percentile for Summer 2009.


Target: Students rank at the 50th percentile or above on the ETS exam.

Implementation Plan (timeline): Fall 2008

Key/Responsible Personnel: Kathleen Wheatley

Supporting Attachments:

Target Achievement: Exceeded

Recommendations : No changes are recommended.

Notes :

Substantiating Evidence:

MBA Goal F8: Legal Issues

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MBA F8: Legal Issues
Course level; Direct - Exam

Details/Description: Students will read the factual scenario below and write 2-3 pages answering the questions that follow. Students are expected to demonstrate the ability to identify and understand the basic legal issues that are involved in a hypothetical situation. The students are evaluated on the following traits.

Identify Legal Issues
Identify Alternatives
Consider Impact on Stakeholder

Target: Meets + above expectations are equal to or greater than 85%.

Implementation Plan (timeline): Spring 2009

Key/Responsible Personnel: Brian Finlay

Supporting Attachments:

Findings for MBA F8: Legal Issues

Summary of Findings:

Overall:

Meets + Expectations: 90%

Above Expectations: 10%
Meets Expectations: 80%
Below Expectations: 10%

Traits:

1) Identify legal issues
Meets + Above Expectations: 95%
Above Expectations: 10%
Meets Expectations: 85%
Below Expectations: 5%

2) Identify alternatives
Meets + Above Expectations: 95%
Above Expectations: 25%
Meets Expectations: 70%
Below Expectations: 5%

3) Consider impact on stakeholder
Meets + Above Expectations: 95%
Above Expectations: 15%
Meets Expectations: 80%
Below Expectations: 5%

Target Achievement: Exceeded

Recommendations : No changes are recommended.

Notes :

Substantiating Evidence:

MBA Goal F9: International Issues

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MBA F9: International Assessment
Course level; Direct - Other

Details/Description: As part of BUSA 587, Strategic Management, students compete in an international micro industry simulation. They are forced to analyze the similarities and differences between different geographic regions and consider this information when making business decisions (such as plant location, local responsiveness, pricing differences, differences in market size, geographic expansion plan). At the end of the course the students have a take-home exam. The following questions are incorporated into this exam.

Findings for MBA F9: International Assessment

Summary of Findings:


International Question: 1) Identify and discuss the various global factors you faced in the simulation. 2) Discuss your analysis of the impact of these global factors. 3) Discuss how these factors directly influenced your business decisions.

Target: Meets + above expectations are equal to or greater than 85%.

Implementation Plan (timeline): Spring 2009

Key/Responsible Personnel: Kathleen Wheatley

Supporting Attachments:

Overall:

Meets + Expectations: 78.57%

Above Expectations: 21.43%
Meets Expectations: 57.14%
Below Expectations: 21.43%

Traits:

1) Identification of global factors

Meets + Above Expectations: 71.43%

Above Expectations: 28.57%
Meets Expectations: 42.86%
Below Expectations: 28.57%

2) Analysis of global factors

Meets + Above Expectations: 78.57%

Above Expectations: 35.71%
Meets Expectations: 42.86%
Below Expectations: 28.57%

3) Application of analysis to management situation

Meets + Above Expectations: 85.71%

Above Expectations: 35.71%
Meets Expectations: 50%
Below Expectations: 14.29%

Target Achievement: Not Met

Recommendations : International scores need to be monitored and improved.

Notes :

Substantiating Evidence:

MBA Goal F10: Entrepreneurship

Mapped to: USA- AACSB- Standards: Assurance of learning standards 4.2, USA- SACS- Comprehensive Standards (Section 3): 3.3.1, 3.3.1.1

Measures & Findings

MBA F10: Entrepreneurship Assessment
Course level; Direct - Exam

Details/Description: The MBA assessment of entrepreneurship will occur in BETR 588. Twenty-eight measures, covering fourteen outcome categories, will be embedded into the midterm and final exams. The midterm will cover the first seven categories with fourteen questions, while the final will cover the last seven categories with fourteen questions. The measures are intended to assess the students’ understanding of general concepts in entrepreneurship that are considered to be key components of the discipline.

Findings for MBA F10: Entrepreneurship Assessment

Summary of Findings:

Meets + Expectations: 6.2%

Above Expectations: 0%
Meets Expectations: 6.2%
Below Expectations: 93.7%

Target Achievement: Not Met


Target: Meets + above expectations are equal to or greater than 85%.

Implementation Plan (timeline): Spring 2009

Key/Responsible Personnel: Bev Brockman

Supporting Attachments:

Recommendations : Entrepreneurship scores need to be monitored and improved.

Notes :

Substantiating Evidence:


Report: Assessment Plan Details for: Early Childhood Education: BS

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Early Childhood Education: BS Outcome Set

Outcomes

PRAXIS II Principles of Learning and Teaching

Mapped to: USA- INTASC- Principles (Model Standards for Beginner Teacher Licensing and Development): Key Indicator 1.1, Key Indicator 1.2, Key Indicator 1.3, Key Indicator 1.4, Key Indicator 1.5, Key Indicator 1.6, Key Indicator 3.1, Key Indicator 3.2, Key Indicator 3.3, Key Indicator 3.4, Key Indicator 3.5, Key Indicator 3.6, Key Indicator 4.1, Key Indicator 4.2, Key Indicator 6.1, Key Indicator 6.2, Key Indicator 6.3, Key Indicator 6.4, Key Indicator 7.1, Key Indicator 7.2, Key Indicator 7.3, Key Indicator 7.4, USA- NCATE- Unit Standards: Sub-Standard 1c., Sub-Standard 1d., Sub-Standard 3c., Sub-Standard 4a., Sub-Standard 4d.

Measures & Findings

PRAXIS II - PLT
Program level; Direct - Exam

Details/Description: All students completing a program of study which leads to initial licensure will achieve a passing score on the required Principles of Learning and Teaching test(s).

Target: An additional two percent of students in initial licensure programs will achieve the required score on mandated PLT Praxis II tests the first time they take the test.

Implementation Plan (timeline): Performance of students will be monitored each testing year for each required PLT test.

Key/Responsible Personnel: Connie Cloud, Certification Officer
Sandra Jones, TPA Administrative Assistant
Valerie Rutledge, TPA Department Head

Supporting Attachments:

Findings for PRAXIS II - PLT

Summary of Findings: 100% of students who complete a licensure track program in education achieved a passing score on the required Praxis II Principles of Learning and Teaching test for Early Childhood.

Target Achievement: Met

Recommendations : Although 100% of candidates achieve a passing score on the required PLT for their program, additional attention and analysis should reveal how many students must take the required test more than one time. Faculty teaching in this program will take the test to insure accurate and up-to-date information about the content and format of the test is provided to students.

Notes : Development is the most important part of this test. It would be most helpful to analyze students' performance to see if there are specific areas on which student scores could be improved. This analysis will be used by faculty to revise course content to assure that it is most closely aligned with the content of the PLT for this level.

Substantiating Evidence:

Praxis II - Content

Mapped to: USA- INTASC- Principles (Model Standards for Beginner Teacher Licensing and Development):

Measures & Findings

Praxis II
Program level; Direct - Exam

Findings for Praxis II


Key Indicator 1.1, Key Indicator 1.2, Key Indicator 1.3, Key Indicator 1.4, Key Indicator 1.5, Key Indicator 1.6, USA- NCATE- Unit Standards: Sub-Standard 1a, Sub-Standard 1b., Sub-Standard 1c.

Details/Description: All students completing a program of study which leads to initial licensure for which a Praxis II test is required will achieve a passing score on the required content area test(s).

Target: Two percent of students in education programs will achieve the required score on mandated Praxis II content tests the first time they take the test(s).

Implementation Plan (timeline): Performance of students will be monitored each testing year for each subset of required content tests.

Key/Responsible Personnel: Connie Cloud, Certification Officer
Sandra Jones, TPA Administrative Assistant
Valerie Rutledge, TPA Department Head
Arts and Sciences Department Heads - Sciences, Math, Social Sciences, English

Supporting Attachments:

Summary of Findings: Students taking the Praxis II tests for Early Childhood have five tests besides the PLT. The results vary for each of these. As a result, particular attention will be paid to the Special Education Praxis 0690. This test has a pass rate of 100% for completers, but more than 30 percent of students must take the test more than once.

Target Achievement: Met

Recommendations : Provide additional test preparation sessions which focus on the content covered in the Special Education Praxis 0690. Obtain additional test prep materials which can be made available to students to help them prepare. Have faculty members take this test to insure that the content of courses addresses the specific areas included on the test.

Notes : Faculty members who teach in this program will take the Praxis II tests for this level of licensure to insure that the courses they teach are appropriately aligned with the content over which students are tested.

Substantiating Evidence:

Clinical Experiences - Education

Mapped to: USA- INTASC- Principles (Model Standards for Beginner Teacher Licensing and Development): Key Indicator 4.1, Key Indicator 4.2, Key Indicator 4.3, USA- NCATE- Unit Standards: Sub-Standard 3a., Sub-Standard 3b., Sub-Standard 3c., Sub-Standard 4a., Sub-Standard 4b., Sub-Standard 4c., Sub-Standard 4d.

Measures & Findings

Clinical Experiences - Education
Course level; Direct - Student Artifact

Details/Description: Students will demonstrate commitment to the profession by successfully completing required clinical placements and related components. Reflections addressing specific questions will be submitted for those classes in which clinical experiences are required and will be evaluated via a common departmental rubric.

Target: Eighty percent of students completing clinical placements will achieve a score of acceptable or target on the rubric used to evaluate reflections related to required clinical placements.

Implementation Plan (timeline): Reflections submitted by students related to clinical placements will be collected and evaluated each semester for those courses which require placement hours.

Key/Responsible Personnel: TPA Tenure Track Faculty
TPA Adjunct Faculty
Carl Raus, Field Placement Coordinator
Jan Gould, College Advisor

Supporting Attachments:

Findings for Clinical Experiences - Education

Summary of Findings: Students' field placements and related components are evaluated using the dispositions evaluation instrument which has been adopted by the department. This information is collected for each student and becomes a part of that student's field placement file.

Target Achievement: Met

Recommendations : Continue to evaluate students' performance of required field placements and components with the goal of insuring that these include a significant range of diverse experiences in a variety of settings. Review and consider refining the rubric used to evaluate field placements to insure that it addresses all aspects of this objective.

Notes : In order to make field experiences more relevant and valuable, professors in residence will be required to present seminars on specific topics including reflection about what students have learned and how they might apply this knowledge to their own classrooms.

Substantiating Evidence:

Clinical Experiences - Child and Family Studies

Measures & Findings


Mapped to: USA- NCATE- Unit Standards: Sub-Standard 3a., Sub-Standard 3b., Sub-Standard 3c., Sub-Standard 4a., Sub-Standard 4b., Sub-Standard 4c., Sub-Standard 4d.

Clinical Experiences - Child and Family Studies
Course level; Direct - Student Artifact

Details/Description: Students will demonstrate commitment to the profession by successfully completing required clinical placements and related components. Reflections addressing specific questions will be submitted for those classes in which clinical experiences are required and will be evaluated via a common departmental rubric.

Target: Eighty percent of students completing clinical placements will achieve a score of acceptable or target on the rubric used to evaluate reflections related to required clinical placements.

Implementation Plan (timeline): Reflections submitted by students related to clinical placements will be collected and evaluated each semester for those courses which require placement hours.

Key/Responsible Personnel: Dr. Cheryl Robinson, TPA Child and Family Studies Professor
Dr. Valerie Rutledge, TPA Department Head

Supporting Attachments:

Findings for Clinical Experiences - Child and Family Studies

Summary of Findings: Students' field placements and related components are evaluated using the dispositions evaluation instrument which has been adopted by the department. This information is collected for each student and becomes a part of that student's field placement file.

Target Achievement: Met

Recommendations : Continue to evaluate students' performance of required field placements and components with the goal of insuring that these include a significant range of diverse experiences in a variety of settings. Review and consider refining the rubric used to evaluate field placements to insure that it addresses all aspects of this objective.

Notes : Child and Family Studies majors will be required to reflect on their field placement experiences to see if what they are taking away from these placements is providing them with the background and knowledge which will prove most relevant for their programs of study.

Substantiating Evidence:

Assessment Module

Mapped to: USA- INTASC- Principles (Model Standards for Beginner Teacher Licensing and Development): Key Indicator 8.1, Key Indicator 8.2, Key Indicator 8.3, Key Indicator 8.4, Key Indicator 8.5, Key Indicator 8.6, USA- NCATE- Unit Standards: Sub-Standard 2a., Sub-Standard 2b., Sub-Standard 2c., Sub-Standard 4a., Sub-Standard 5e.

Measures & Findings

Assessment Module
Course level; Indirect - Survey

Details/Description: Students will complete the required assessment module (part 1) and related evaluation to demonstrate their recognition of the importance of this information.

Target: Seventy-five percent of students responding to the survey and evaluation instrument related to the Assessment Module of the Degree +3 program will respond that they recognize the importance of the information offered during this instrument.

Implementation Plan (timeline): All students in EDUC 201 will complete Assessment Module 1 of the Degree +3 program. Following completion, they will respond to this instrument with at least 75% indicating that they recognize the importance/value of the information presented.

Key/Responsible Personnel: Dr. Pam Carter
Dr. Kim Wingate

Supporting Attachments:

Findings for Assessment Module

Summary of Findings: All students who are either enrolled in 201 at UTC or who are attempting to meet Checkpoint 1 completed the required assessment module. Results of evaluations of this experience reveal that over 75% recognize and understand the value of this information to them as future educators who will be held accountable for analyzing their own students' performance on standardized instruments.

Target Achievement: Exceeded

Recommendations : Expand the program to include graduate students seeking initial licensure. In addition, continue to work to develop and begin to present assessment module 2 to all education students pursuing licensure. This module is part of the next, more intensive focus on use of assessment information to make decisions about effective teaching methods.

Notes : Because of the value and importance placed on the effective use of student data to make instructional decisions, this module will continue to be refined and developed to insure that students will have experience and expertise in analyzing student performance with the goal of matching student needs with instructional techniques.



Substantiating Evidence:

Dispositions

Mapped to: USA- INTASC- Principles (Model Standards for Beginner Teacher Licensing and Development): Key Indicator 9.1, Key Indicator 9.2, Key Indicator 9.3, USA- NCATE- Unit Standards: Sub-Standard 1g., Sub-Standard 3a., Sub-Standard 3c., Sub-Standard 5b., Sub-Standard 5c., Sub-Standard 5d., Sub-Standard 5f.

Measures & Findings

Teacher Licensure Program Dispositions
Course level; Indirect - Survey

Details/Description: Professors will evaluate students' performance in the identified dispositions through completion of a reflection required in specific courses in each program.

Target: Professors will evaluate student demonstration of commitment to dispositions adopted by the unit as acceptable or target in 75% of the disposition areas.

Implementation Plan (timeline): Students will submit a disposition reflective paper via LiveText to professors in selected education courses, and these papers will be analyzed to determine the level of performance demonstrated by students. Furthermore, data will be disaggregated at the program level for purposes of program review and revision to address disposition-related behaviors.

Key/Responsible Personnel: TPA Tenure-Track Faculty
TPA Adjunct Faculty
Carl Raus, Field Placement Coordinator
Sandra Jones, TPA Administrative Assistant
Valerie Rutledge, TPA Department Head

Supporting Attachments:

Findings for Teacher Licensure Program Dispositions

Summary of Findings: Faculty are asked to complete the dispositions assessment instrument for students in their courses. This information is collected and compiled for each student. Particular attention has been given to evaluating the commitment of students to professional growth through their attendance at and participation in a range of activities. This documentation reveals that more students could engage in professional growth activities.

Target Achievement: Not Met

Recommendations : Offer more opportunities for participation in professional activities, develop seminars which address student needs and interests related to education, and encourage students to investigate a wide range of possible professional development options.

Notes : Additional ways of measuring student dispositions will be identified, and these will be implemented to insure that data about students' performance in these areas will be collected and analyzed.

Substantiating Evidence:


Report: Assessment Plan Details for: Initial Licensure MEd

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Initial Licensure MEd Outcome Set

Outcomes

Student Retention

Mapped to: No Mapping

Measures & Findings

Completion Rate
Program level; Direct - Other

Details/Description: M.Ed. students who are admitted to candidacy will complete their Master's degree within three years from the time of their initial enrollment.

Target: 90%

Implementation Plan (timeline): Semester

Key/Responsible Personnel: Department Head

Supporting Attachments:

No Findings Added to Completion Rate

Student Program Satisfaction

Mapped to: No Mapping

Measures & Findings

Program Satisfaction
Program level; Indirect - Survey

Details/Description: Those students who receive a Master's degree will express satisfaction with program delivery, course availability, and relevance of course work, as reported by means of a questionnaire.

Target: 75%

Implementation Plan (timeline): Semester

Key/Responsible Personnel: Department Head

Supporting Attachments:

No Findings Added to Program Satisfaction

License Obtainment

Mapped to: No Mapping

Measures & Findings


Professional Licensure
Program level; Direct - Other

Details/Description: M.Ed. students who complete their programs at UTC will submit appropriate documentation to their State Department of Education for additional degree status and their teaching license.

Target: 100%

Implementation Plan (timeline): Semester

Key/Responsible Personnel: Department Head

Supporting Attachments:

No Findings Added to Professional Licensure


Report: Assessment Plan Details for: Middle Grades Education: BS

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Middle Grades Education: BS Outcome Set

Outcomes

Praxis II Principles of Learning and Teaching

Mapped to: USA- INTASC- Principles (Model Standards for Beginner Teacher Licensing and Development): Key Indicator 1.1, Key Indicator 1.2, Key Indicator 1.3, Key Indicator 1.4, Key Indicator 1.5, Key Indicator 1.6, Key Indicator 3.1, Key Indicator 3.2, Key Indicator 3.3, Key Indicator 3.4, Key Indicator 3.5, Key Indicator 3.6, Key Indicator 4.1, Key Indicator 4.2, Key Indicator 6.1, Key Indicator 6.2, Key Indicator 6.3, Key Indicator 6.4, Key Indicator 7.1, Key Indicator 7.2, Key Indicator 7.3, Key Indicator 7.4, USA- NCATE- Unit Standards: Sub-Standard 1c., Sub-Standard 1d., Sub-Standard 3c., Sub-Standard 4a., Sub-Standard 4d.

Measures & Findings

Praxis II - PLT
Program level; Direct - Exam

Details/Description: All students completing a program of study which leads to initial licensure will achieve a passing score on the required Principles of Learning and Teaching test(s).

Target: An additional two percent of students in initial licensure programs will achieve the required score on mandated PLT Praxis II tests the first time they take the test.

Implementation Plan (timeline): Performance of students will be monitored each testing year for each required PLT test.

Key/Responsible Personnel: Connie Cloud, Certification Officer
Sandra Jones, TPA Administrative Assistant
Valerie Rutledge, TPA Department Head

Supporting Attachments:

Findings for Praxis II - PLT

Summary of Findings: 100% of students who complete a licensure track program in education achieved a passing score on the required Praxis II Principles of Learning and Teaching test.

Target Achievement: Met

Recommendations : Although 100% of candidates achieve a passing score on the required PLT for their program, additional attention and analysis should reveal how many students must take the required test more than one time. Faculty teaching in this program will take the test to insure accurate and up-to-date information about the content and format of the test is provided to students.

Notes :

Substantiating Evidence:

Praxis II - Content

Mapped to: USA- INTASC- Principles (Model Standards for Beginner Teacher Licensing and Development): Key Indicator 1.1, Key Indicator 1.2, Key Indicator 1.3, Key Indicator 1.4, Key Indicator 1.5, Key Indicator 1.6, USA- NCATE- Unit Standards: Sub-Standard 1a, Sub-Standard 1b., Sub-Standard 1c.

Measures & Findings

Praxis II
Program level; Direct - Exam

Details/Description: All students completing a program of study which leads to initial licensure for which a Praxis II test is required will achieve a passing score on the required content area test(s).

Target: Two percent of students in education programs will achieve the required score on mandated Praxis II content tests the first time they take the test(s).

Findings for Praxis II

Summary of Findings: Students taking the Praxis II tests for Middle Grades have two tests besides the PLT. The results vary for each of these. As a result, particular attention will be paid to the Middle Grades Content test. This test has a pass rate of 100% for completers, but more than 30 percent of students must take the test more than once.


Implementation Plan (timeline): Performance of students will be monitored each testing year for each subset of required content tests.

Key/Responsible Personnel: Connie Cloud, Certification Officer
Sandra Jones, TPA Administrative Assistant
Valerie Rutledge, TPA Department Head
Arts and Sciences Department Heads - Sciences, Math, Social Sciences, English

Supporting Attachments:

Target Achievement: Met

Recommendations : Provide test preparation sessions which focus on the content covered in the Middle Grades Content test. Obtain additional test prep materials which can be made available to students to help them prepare. Have faculty members take this test to insure that the content of courses addresses the specific areas included on the test.

Notes :

Substantiating Evidence:

Clinical Experiences - Education

Mapped to: USA- INTASC- Principles (Model Standards for Beginner Teacher Licensing and Development): Key Indicator 4.1, Key Indicator 4.2, Key Indicator 4.3, USA- NCATE- Unit Standards: Sub-Standard 3a., Sub-Standard 3b., Sub-Standard 3c., Sub-Standard 4a., Sub-Standard 4b., Sub-Standard 4c., Sub-Standard 4d.

Measures & Findings

Clinical Experiences - Education
Course level; Direct - Student Artifact

Details/Description: Students will demonstrate commitment to the profession by successfully completing required clinical placements and related components. Reflections addressing specific questions will be submitted for those classes in which clinical experiences are required and will be evaluated via a common departmental rubric.

Target: Eighty percent of students completing clinical placements will achieve a score of acceptable or target on the rubric used to evaluate reflections related to required clinical placements.

Implementation Plan (timeline): Reflections submitted by students related to clinical placements will be collected and evaluated each semester for those courses which require placement hours.

Key/Responsible Personnel: TPA Tenure Track Faculty
TPA Adjunct Faculty
Carl Raus, Field Placement Coordinator
Jan Gould, College Advisor

Supporting Attachments:

Findings for Clinical Experiences - Education

Summary of Findings: Students' field placements and related components are evaluated using the dispositions evaluation instrument which has been adopted by the department. This information is collected for each student and becomes a part of that student's field placement file.

Target Achievement: Met

Recommendations : Continue to evaluate students' performance of required field placements and components with the goal of insuring that these include a significant range of diverse experiences in a variety of settings. Review and consider refining the rubric used to evaluate field placements to insure that it addresses all aspects of this objective.

Notes :

Substantiating Evidence:

Clinical Experiences - Child and Family Studies

Mapped to: USA- NCATE- Unit Standards: Sub-Standard 3a., Sub-Standard 3b., Sub-Standard 3c., Sub-Standard 4a., Sub-Standard 4b., Sub-Standard 4c., Sub-Standard 4d.

Measures & Findings

Clinical Experiences - Child and Family Studies
Course level; Direct - Student Artifact

Details/Description: Students will demonstrate commitment to the profession by successfully completing required clinical placements and related components. Reflections addressing specific questions will be submitted for those classes in which clinical experiences are required and will be evaluated via a common departmental rubric.

Findings for Clinical Experiences - Child and Family Studies

Summary of Findings: Students' field placements and related components are evaluated using the dispositions evaluation instrument which has been adopted by the department. This information is collected for each student and becomes a part of that student's field placement file.


Target: Eighty percent of students completing clinical placements will achieve a score of acceptable or target on the rubric used to evaluate reflections related to required clinical placements.

Implementation Plan (timeline): Reflections submitted by students related to clinical placements will be collected and evaluated each semester for those courses which require placement hours.

Key/Responsible Personnel: Dr. Cheryl Robinson, TPA Child and Family Studies Professor
Dr. Valerie Rutledge, TPA Department Head

Supporting Attachments:

Target Achievement: Met

Recommendations : Continue to evaluate students' performance of required field placements and components with the goal of insuring that these include a significant range of diverse experiences in a variety of settings. Review and consider refining the rubric used to evaluate field placements to insure that it addresses all aspects of this objective.

Notes :

Substantiating Evidence:

Assessment Module

Mapped to: USA- INTASC- Principles (Model Standards for Beginner Teacher Licensing and Development): Key Indicator 8.1, Key Indicator 8.2, Key Indicator 8.3, Key Indicator 8.4, Key Indicator 8.5, Key Indicator 8.6, USA- NCATE- Unit Standards: Sub-Standard 2a., Sub-Standard 2b., Sub-Standard 2c., Sub-Standard 4a., Sub-Standard 5e.

Measures & Findings

Assessment Module
Course level; Indirect - Survey

Details/Description: Students will complete the required assessment module (part 1) and related evaluation to demonstrate their recognition of the importance of this information.

Target: Seventy-five percent of students responding to the survey and evaluation instrument related to the Assessment Module of the Degree +3 program will respond that they recognize the importance of the information offered during this instrument.

Implementation Plan (timeline): All students in EDUC 201 will complete Assessment Module 1 of the Degree +3 program. Following completion, they will respond to this instrument with at least 75% indicating that they recognize the importance/value of the information presented.

Key/Responsible Personnel: Dr. Pam Carter
Dr. Kim Wingate

Supporting Attachments:

Findings for Assessment Module

Summary of Findings: All students who are either enrolled in 201 at UTC or who are attempting to meet Checkpoint 1 completed the required assessment module. Results of evaluations of this experience reveal that over 75% recognize and understand the value of this information to them as future educators who will be held accountable for analyzing their own students' performance on standardized instruments.

Target Achievement: Exceeded

Recommendations : Expand the program to include graduate students seeking initial licensure. In addition, continue to work to develop and begin to present assessment module 2 to all education students pursuing licensure. This module is part of the next, more intensive focus on use of assessment information to make decisions about effective teaching methods.

Notes :

Substantiating Evidence:

Dispositions

Mapped to: USA- INTASC-Principles (Model Standards for Beginner Teacher Licensing and Development): Key Indicator 9.1, Key Indicator 9.2, Key Indicator 9.3; USA- NCATE- Unit Standards: Sub-Standard 1g., Sub-Standard 3a., Sub-Standard 3c., Sub-Standard 5b., Sub-Standard 5c., Sub-Standard 5d., Sub-Standard 5f.

Measures & Findings

Teacher Licensure Program Dispositions
Course level; Indirect - Survey

Details/Description: Professors will evaluate students' performance in the identified dispositions through completion of a reflection required in specific courses in each program.

Target: Professors will evaluate student demonstration of commitment to dispositions adopted by the unit as acceptable or target in 75% of the disposition areas.

Findings for Teacher Licensure Program Dispositions

Summary of Findings: Faculty are asked to complete the dispositions assessment instrument for students in their courses. This information is collected and compiled for each student. Particular attention has been given to evaluating the commitment of students to professional growth through their attendance at and participation in a range of activities. This documentation reveals that more students could engage in professional growth activities.

Outcome Assessment Details http://folio.taskstream.com/Folio/CIPReports/AMSReports_outcome_assessment_detail.asp?qy...

3 of 4 8/6/2010 1:14 PM

Page 155: Completed Academic Program Assessment Plans 20082009

Implementation Plan (timeline): Students will submit a disposition reflective paper via LiveText to professors in selected education courses, and these papers will be analyzed to determine the level of performance demonstrated by students. Furthermore, data will be disaggregated at the program level for purposes of program review and revision to address disposition-related behaviors.

Key/Responsible Personnel: TPA Tenure-Track Faculty; TPA Adjunct Faculty; Carl Raus, Field Placement Coordinator; Sandra Jones, TPA Administrative Assistant; Valerie Rutledge, TPA Department Head

Supporting Attachments:

Target Achievement: Not Met

Recommendations : Offer more opportunities for participation in professional activities, develop seminars which address student needs and interests related to education, and encourage students to investigate a wide range of possible professional development options.

Notes :

Substantiating Evidence:


Report: Assessment Plan Details for: Nursing: BSN

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Monday, August 23, 2010

Measures and Findings

Nursing: BSN Outcome Set

Outcomes

Synthesize theoretical and empirical knowledge

Mapped to: USA- CCNE- The Essentials of Baccalaureate Education for Professional Nursing Practice (2008): Essential I, Essential III, Essential IV, Essential IX

Measures & Findings

Student & Alumni Satisfaction
Program level; Indirect - Survey

Details/Description: Advisement; Exit Interviews; Alumni Survey

Target: Current and graduating students and alumni

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student & Alumni Satisfaction

Summary of Findings: Advisement: Student satisfaction with advisement and student satisfaction as they exit from the program are other indicators of program effectiveness. The SON advises all pre-nursing students, one of our largest community of interest groups. Advisement is considered an important component of University effectiveness. Block advisement is performed with traditional students enrolled in the Undergraduate Program. The Pre-Nursing Group evaluates the advising process each semester (table 4.10). The benchmark for Pre-Nursing student satisfaction with advisement is 90%. The University-wide median score for advisement of SON students, on a scale of 1 to 6, is consistently 6. Student satisfaction with the Pre-Nursing Advisement process continued to improve, and SON advisement of all students is considered outstanding by the University.

Exit Interviews: As a component of ongoing continuous quality improvement, exit interviews of students at the end of the fifth semester began in spring 2007. Prior to this time, satisfaction with the Program was measured at one year post graduation by mail surveys. This method was ineffective despite many reminders; the rate of return was dismal. Tables 4.11-4.13 present the Undergraduate Exit Interview questions. Summary data for question (4), “Is there anything else you would like to share with us about your experiences at the SON?”, will be available on site. Because of the difference in student population and teaching methodology, the Gateway students complete an exit exam that measures their educational experience. Data will be available for review in the Resource Room.

Alumni Survey: Alumni satisfaction with the program is mixed. For the May 08 graduates, the number of respondents was only 9 alumni at six months. Of those nine alumni, 33-50% were dissatisfied with some measures, but 77% were satisfied with the skills and competencies acquired from the program. This is consistent with the exit interviews; time did not change their perception. Regarding the mixed results on items such as classroom space, student gathering area, and student input into the program, we have responded and have processes in place: the renovation of space should be complete fall semester 2009, and we are actively trying to involve the students in governance of the SON.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Student Achievement
Program level; Direct - Exam

Details/Description: ATI RN Predictor; NCLEX exam; Collegiate Assessment of Academic Proficiency (CAAP); Graduation Rate; Critical Thinking

Target: Graduating BSN students

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student Achievement

Summary of Findings: ATI RN Predictor: The ATI RN-Predictor is used to measure the likelihood of student success on the first attempt at writing the NCLEX-RN exam. The trends of the subsections of the RN-Predictor are used to evaluate the curriculum content and process. The majority of students graduating in December 2007 and May 2008 were enrolled in a required 3-semester-credit NCLEX-preparation course, N440, in the fifth semester if they did not achieve the established benchmark, a score in the 65th percentile or higher on the RN Comprehensive Predictor. This benchmark was considered by ATI to be predictive of success on the NCLEX (table 4-3). ATI recommended changing the scoring for the Comprehensive Predictor from percentile to percent probability of passing, and its recommended benchmark was 94% probability. The Undergraduate students take the exam at the end of the fourth semester, having one semester before taking the NCLEX-RN. The SON's current benchmark is 98% probability of passing the NCLEX-RN. Beginning with students in Level 5 in fall semester 2008, the process changed to better utilize our resources and student synthesis of learning in this culminating semester. Students graduating in December 2008 and May 2009 were now required to enroll in Virtual ATI (VATI) 2 weeks prior to graduation if their score on the RN-Predictor was less than the benchmark of 98% probability of passing the NCLEX. This is an online individualized NCLEX preparation course that predicts 99% probability of success. The two students who were not successful in December of 2008 on the first-time pass were enrolled in VATI but did not participate before taking the NCLEX. This is reflected in the 2009 Education Summary Report for the National Council of State Boards of Nursing.
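
As an illustration only, the remediation policy described above reduces to a single threshold check on each student's ATI-predicted probability of passing; the sketch below is hypothetical (only the 98% SON benchmark and the 94% ATI-recommended benchmark come from this report) and is not part of the SON's actual tooling.

    # Hypothetical sketch of the screening rule described above: students whose
    # RN Comprehensive Predictor result falls below the SON benchmark of 98%
    # predicted probability of passing NCLEX-RN are flagged to enroll in
    # Virtual ATI (VATI) two weeks before graduation.
    SON_BENCHMARK = 0.98

    def needs_vati(predicted_pass_probability: float) -> bool:
        """Return True if the student falls below the SON benchmark."""
        return predicted_pass_probability < SON_BENCHMARK

    # A student at the ATI-recommended 94% benchmark would still be flagged
    # under the SON's stricter 98% benchmark.
    print(needs_vati(0.94))  # True
    print(needs_vati(0.99))  # False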

NCLEX-RN: The SON benchmark is at or above the national percent passing of first-time BSN candidates as specified by the National Council Licensure Examination for Registered Nurses. This is also the benchmark used by the UT System and THEC for program effectiveness. The 2009 pass rate only includes the class that graduated in December 2008. As of August 2009, 23 of the 24 students who graduated in May 2009 have taken the NCLEX-RN exam; all have passed on the first attempt.

Pass rate for first-time writers of the NCLEX-RN:
2006 Pass Rate = 92.3%
2007 Pass Rate = 87.27%
2008 Pass Rate = 100%
2009 Pass Rate = 91.6%

Graduation Rate: The graduation rate for the BSN Program is calculated by dividing the number of students graduating seven semesters later by the number of students entering the nursing major. The benchmark is 85%. The graduation rate for 2008 was 83% (table 4.5). This occurred because 4 students withdrew from the program and 3 students were dismissed for academic reasons. The four students who withdrew did so for personal reasons such as complicated pregnancy, husband relocating, and other family situations. The first cohort of the RN to BSN Track began May 2007. This group of students' graduation rate is 96%.

Traditional Student Undergraduate Graduation Rates (calculation based upon seven semesters):
2006 = 93%
2007 = 88%
2008 = 83%
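
Because the original "dividing ... into" phrasing of this calculation can be misread, a brief worked sketch may help: the rate is the number of graduates seven semesters later divided by the number of students who entered the major. The cohort sizes below are hypothetical; only the 85% benchmark and the reported yearly rates come from this report.

    # Hypothetical example of the graduation-rate calculation described above.
    BENCHMARK = 0.85  # SON benchmark for the seven-semester graduation rate

    def graduation_rate(entered: int, graduated: int) -> float:
        """Graduates seven semesters later divided by students entering the major."""
        return graduated / entered

    # e.g., an assumed cohort of 41 entrants with 34 completers gives roughly 83%,
    # which would fall just below the 85% benchmark.
    rate = graduation_rate(entered=41, graduated=34)
    print(f"{rate:.0%}")  # 83%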

CAAP: All undergraduate graduating seniors are required to take the Collegiate Assessment of Academic Proficiency (CAAP), a standardized multiple-choice test used to measure content areas related to writing skills, mathematics, reading, science, and critical thinking. Aggregated results are reported. University students are randomly assigned to take only one of the five tests mentioned during the semester before they graduate. Since the exam is given after the student has completed all SON courses in addition to University general education courses, faculty believe that comparing our students with University and College students is a means of gauging student learning effectiveness.

In 2006, students graduating in the nursing major had higher mean scores in reading than other graduating students in the College, and higher means than College and total University graduating students in science reasoning and critical thinking, but lower means in writing skills and mathematics.

Critical Thinking: The SON also measures Critical Thinking using the ATI exam. There was no difference in entry/exit Critical Thinking scores for the January 2007 graduates. This is likely a regression-toward-the-mean phenomenon, as this class entered with an unusually high mean, closer to the exit scores of the other classes. The first cohort of Gateway students did not show a significant increase in scores; however, the time between testing was abbreviated, as the students were not given the entrance test until they had been in the program for about 6 months. This oversight has been corrected.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Quality of nursing practice

Mapped to: USA- CCNE- The Essentials of Baccalaureate Education for Professional Nursing Practice (2008): Essential III, Essential IV

Measures & Findings

Employer Satisfaction
Program level; Indirect - Survey

Details/Description: BSN Employer Survey

Target: Employers of BSN graduates

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Employer Satisfaction

Summary of Findings: Obtaining survey data from employers has historically been difficult.

Target Achievement: Not Met

Recommendations : To distribute surveys at the yearly Nursing Community Advisory Committee meeting, where there is wide representation from the agencies employing graduates of our undergraduate program.

Notes :

Substantiating Evidence:

Student Achievement
Program level; Direct - Exam

Details/Description: ATI RN Predictor; NCLEX exam; Collegiate Assessment of Academic Proficiency (CAAP); Graduation Rate; Critical Thinking

Target: Graduating BSN students

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student Achievement

Summary of Findings: ATI RN Predictor: The ATI RN-Predictor is used to measure the likelihood of student success on the first attempt at writing the NCLEX-RN exam. The trends of the subsections of the RN-Predictor are used to evaluate the curriculum content and process. The majority of students graduating in December 2007 and May 2008 were enrolled in a required 3-semester-credit NCLEX-preparation course, N440, in the fifth semester if they did not achieve the established benchmark, a score in the 65th percentile or higher on the RN Comprehensive Predictor. This benchmark was considered by ATI to be predictive of success on the NCLEX (table 4-3). ATI recommended changing the scoring for the Comprehensive Predictor from percentile to percent probability of passing, and its recommended benchmark was 94% probability. The Undergraduate students take the exam at the end of the fourth semester, having one semester before taking the NCLEX-RN. The SON's current benchmark is 98% probability of passing the NCLEX-RN. Beginning with students in Level 5 in fall semester 2008, the process changed to better utilize our resources and student synthesis of learning in this culminating semester. Students graduating in December 2008 and May 2009 were now required to enroll in Virtual ATI (VATI) 2 weeks prior to graduation if their score on the RN-Predictor was less than the benchmark of 98% probability of passing the NCLEX. This is an online individualized NCLEX preparation course that predicts 99% probability of success. The two students who were not successful in December of 2008 on the first-time pass were enrolled in VATI but did not participate before taking the NCLEX. This is reflected in the 2009 Education Summary Report for the National Council of State Boards of Nursing.

NCLEX-RN: The SON benchmark is at or above the national percent passing of first-time BSN candidates as specified by the National Council Licensure Examination for Registered Nurses. This is also the benchmark used by the UT System and THEC for program effectiveness. The 2009 pass rate only includes the class that graduated in December 2008. As of August 2009, 23 of the 24 students who graduated in May 2009 have taken the NCLEX-RN exam; all have passed on the first attempt.

Pass rate for first-time writers of the NCLEX-RN:
2006 Pass Rate = 92.3%
2007 Pass Rate = 87.27%
2008 Pass Rate = 100%
2009 Pass Rate = 91.6%

Graduation Rate: The graduation rate for the BSN Program is calculated by dividing the number of students graduating seven semesters later by the number of students entering the nursing major. The benchmark is 85%. The graduation rate for 2008 was 83% (table 4.5). This occurred because 4 students withdrew from the program and 3 students were dismissed for academic reasons. The four students who withdrew did so for personal reasons such as complicated pregnancy, husband relocating, and other family situations. The first cohort of the RN to BSN Track began May 2007. This group of students' graduation rate is 96%.

Traditional Student Undergraduate Graduation Rates (calculation based upon seven semesters):
2006 = 93%
2007 = 88%
2008 = 83%

CAAP: All undergraduate graduating seniors are required to take the Collegiate Assessment of Academic Proficiency (CAAP), a standardized multiple-choice test used to measure content areas related to writing skills, mathematics, reading, science, and critical thinking. Aggregated results are reported. University students are randomly assigned to take only one of the five tests mentioned during the semester before they graduate. Since the exam is given after the student has completed all SON courses in addition to University general education courses, faculty believe that comparing our students with University and College students is a means of gauging student learning effectiveness.

In 2006, students graduating in the nursing major had higher mean scores in reading than other graduating students in the College, and higher means than College and total University graduating students in science reasoning and critical thinking, but lower means in writing skills and mathematics.

Critical Thinking: The SON also measures Critical Thinking using the ATI exam. There was no difference in entry/exit Critical Thinking scores for the January 2007 graduates. This is likely a regression-toward-the-mean phenomenon, as this class entered with an unusually high mean, closer to the exit scores of the other classes. The first cohort of Gateway students did not show a significant increase in scores; however, the time between testing was abbreviated, as the students were not given the entrance test until they had been in the program for about 6 months. This oversight has been corrected.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Student and Alumni Satisfaction
Program level; Indirect - Survey

Details/Description: Advisement; Exit Interviews; Alumni Survey

Target: Current and graduating students and alumni

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student and Alumni Satisfaction

Summary of Findings: Advisement: Student satisfaction with advisement and student satisfaction as they exit from the program are other indicators of program effectiveness. The SON advises all pre-nursing students, one of our largest community of interest groups. Advisement is considered an important component of University effectiveness. Block advisement is performed with traditional students enrolled in the Undergraduate Program. The Pre-Nursing Group evaluates the advising process each semester (table 4.10). The benchmark for Pre-Nursing student satisfaction with advisement is 90%. The University-wide median score for advisement of SON students, on a scale of 1 to 6, is consistently 6. Student satisfaction with the Pre-Nursing Advisement process continued to improve, and SON advisement of all students is considered outstanding by the University.

Exit Interviews: As a component of ongoing continuous quality improvement, exit interviews of students at the end of the fifth semester began in spring 2007. Prior to this time, satisfaction with the Program was measured at one year post graduation by mail surveys. This method was ineffective despite many reminders; the rate of return was dismal. Tables 4.11-4.13 present the Undergraduate Exit Interview questions. Summary data for question (4), “Is there anything else you would like to share with us about your experiences at the SON?”, will be available on site. Because of the difference in student population and teaching methodology, the Gateway students complete an exit exam that measures their educational experience. Data will be available for review in the Resource Room.

Alumni Survey: Alumni satisfaction with the program is mixed. For the May 08 graduates, the number of respondents was only 9 alumni at six months. Of those nine alumni, 33-50% were dissatisfied with some measures, but 77% were satisfied with the skills and competencies acquired from the program. This is consistent with the exit interviews; time did not change their perception. Regarding the mixed results on items such as classroom space, student gathering area, and student input into the program, we have responded and have processes in place: the renovation of space should be complete fall semester 2009, and we are actively trying to involve the students in governance of the SON.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Leadership skills

Mapped to: USA- CCNE- The Essentials of Baccalaureate Education for Professional Nursing Practice (2008): Essential II, Essential V, Essential VI, Essential VII

Measures & Findings

Employer Satisfaction
Program level; Indirect - Survey

Details/Description: BSN Employer Survey

Target: Employers of BSN graduates

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Employer Satisfaction

Summary of Findings: Obtaining survey data from employers has historically been difficult.

Target Achievement: Not Met

Recommendations : To distribute surveys at the yearly Nursing Community Advisory Committee meeting, where there is wide representation from the agencies employing graduates of our undergraduate program.

Notes :

Substantiating Evidence:

Student & Alumni Satisfaction
Program level; Indirect - Survey

Details/Description: Advisement; Exit Interviews; Alumni Survey

Target: Current and graduating students and alumni

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student & Alumni Satisfaction

Summary of Findings: Advisement: Student satisfaction with advisement and student satisfaction as they exit from the program are other indicators of program effectiveness. The SON advises all pre-nursing students, one of our largest community of interest groups. Advisement is considered an important component of University effectiveness. Block advisement is performed with traditional students enrolled in the Undergraduate Program. The Pre-Nursing Group evaluates the advising process each semester (table 4.10). The benchmark for Pre-Nursing student satisfaction with advisement is 90%. The University-wide median score for advisement of SON students, on a scale of 1 to 6, is consistently 6. Student satisfaction with the Pre-Nursing Advisement process continued to improve, and SON advisement of all students is considered outstanding by the University.

Exit Interviews: As a component of ongoing continuous quality improvement, exit interviews of students at the end of the fifth semester began in spring 2007. Prior to this time, satisfaction with the Program was measured at one year post graduation by mail surveys. This method was ineffective despite many reminders; the rate of return was dismal. Tables 4.11-4.13 present the Undergraduate Exit Interview questions. Summary data for question (4), “Is there anything else you would like to share with us about your experiences at the SON?”, will be available on site. Because of the difference in student population and teaching methodology, the Gateway students complete an exit exam that measures their educational experience. Data will be available for review in the Resource Room.

Alumni Survey: Alumni satisfaction with the program is mixed. For the May 08 graduates, the number of respondents was only 9 alumni at six months. Of those nine alumni, 33-50% were dissatisfied with some measures, but 77% were satisfied with the skills and competencies acquired from the program. This is consistent with the exit interviews; time did not change their perception. Regarding the mixed results on items such as classroom space, student gathering area, and student input into the program, we have responded and have processes in place: the renovation of space should be complete fall semester 2009, and we are actively trying to involve the students in governance of the SON.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Student Achievement
Program level; Direct - Exam

Details/Description: ATI RN Predictor; NCLEX exam; Collegiate Assessment of Academic Proficiency (CAAP); Graduation Rate; Critical Thinking

Target: Graduating BSN students

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student Achievement

Summary of Findings: ATI RN Predictor: The ATI RN-Predictor is used to measure the likelihood of student success on the first attempt at writing the NCLEX-RN exam. The trends of the subsections of the RN-Predictor are used to evaluate the curriculum content and process. The majority of students graduating in December 2007 and May 2008 were enrolled in a required 3-semester-credit NCLEX-preparation course, N440, in the fifth semester if they did not achieve the established benchmark, a score in the 65th percentile or higher on the RN Comprehensive Predictor. This benchmark was considered by ATI to be predictive of success on the NCLEX (table 4-3). ATI recommended changing the scoring for the Comprehensive Predictor from percentile to percent probability of passing, and its recommended benchmark was 94% probability. The Undergraduate students take the exam at the end of the fourth semester, having one semester before taking the NCLEX-RN. The SON's current benchmark is 98% probability of passing the NCLEX-RN. Beginning with students in Level 5 in fall semester 2008, the process changed to better utilize our resources and student synthesis of learning in this culminating semester. Students graduating in December 2008 and May 2009 were now required to enroll in Virtual ATI (VATI) 2 weeks prior to graduation if their score on the RN-Predictor was less than the benchmark of 98% probability of passing the NCLEX. This is an online individualized NCLEX preparation course that predicts 99% probability of success. The two students who were not successful in December of 2008 on the first-time pass were enrolled in VATI but did not participate before taking the NCLEX. This is reflected in the 2009 Education Summary Report for the National Council of State Boards of Nursing.

NCLEX-RN: The SON benchmark is at or above the national percent passing of first-time BSN candidates as specified by the National Council Licensure Examination for Registered Nurses. This is also the benchmark used by the UT System and THEC for program effectiveness. The 2009 pass rate only includes the class that graduated in December 2008. As of August 2009, 23 of the 24 students who graduated in May 2009 have taken the NCLEX-RN exam; all have passed on the first attempt.

Pass rate for first-time writers of the NCLEX-RN:
2006 Pass Rate = 92.3%
2007 Pass Rate = 87.27%
2008 Pass Rate = 100%
2009 Pass Rate = 91.6%

Graduation Rate: The graduation rate for the BSN Program is calculated by dividing the number of students graduating seven semesters later by the number of students entering the nursing major. The benchmark is 85%. The graduation rate for 2008 was 83% (table 4.5). This occurred because 4 students withdrew from the program and 3 students were dismissed for academic reasons. The four students who withdrew did so for personal reasons such as complicated pregnancy, husband relocating, and other family situations. The first cohort of the RN to BSN Track began May 2007. This group of students' graduation rate is 96%.

Traditional Student Undergraduate Graduation Rates (calculation based upon seven semesters):
2006 = 93%
2007 = 88%
2008 = 83%

CAAP: All undergraduate graduating seniors are required to take the Collegiate Assessment of Academic Proficiency (CAAP), a standardized multiple-choice test used to measure content areas related to writing skills, mathematics, reading, science, and critical thinking. Aggregated results are reported. University students are randomly assigned to take only one of the five tests mentioned during the semester before they graduate. Since the exam is given after the student has completed all SON courses in addition to University general education courses, faculty believe that comparing our students with University and College students is a means of gauging student learning effectiveness.

In 2006, students graduating in the nursing major had higher mean scores in reading than other graduating students in the College, and higher means than College and total University graduating students in science reasoning and critical thinking, but lower means in writing skills and mathematics.

Critical Thinking: The SON also measures Critical Thinking using the ATI exam. There was no difference in entry/exit Critical Thinking scores for the January 2007 graduates. This is likely a regression-toward-the-mean phenomenon, as this class entered with an unusually high mean, closer to the exit scores of the other classes. The first cohort of Gateway students did not show a significant increase in scores; however, the time between testing was abbreviated, as the students were not given the entrance test until they had been in the program for about 6 months. This oversight has been corrected.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Research findings in practice

Mapped to: USA- CCNE- The Essentials of Baccalaureate Education for Professional Nursing Practice (2008): Essential III, Essential IV, Essential IX

Measures & Findings

Student & Alumni Satisfaction
Program level; Indirect - Survey

Details/Description: Advisement; Exit Interviews; Alumni Survey

Target: Current and graduating students and alumni

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student & Alumni Satisfaction

Summary of Findings: Advisement: Student satisfaction with advisement and student satisfaction as they exit from the program are other indicators of program effectiveness. The SON advises all pre-nursing students, one of our largest community of interest groups. Advisement is considered an important component of University effectiveness. Block advisement is performed with traditional students enrolled in the Undergraduate Program. The Pre-Nursing Group evaluates the advising process each semester (table 4.10). The benchmark for Pre-Nursing student satisfaction with advisement is 90%. The University-wide median score for advisement of SON students, on a scale of 1 to 6, is consistently 6. Student satisfaction with the Pre-Nursing Advisement process continued to improve, and SON advisement of all students is considered outstanding by the University.

Exit Interviews: As a component of ongoing continuous quality improvement, exit interviews of students at the end of the fifth semester began in spring 2007. Prior to this time, satisfaction with the Program was measured at one year post graduation by mail surveys. This method was ineffective despite many reminders; the rate of return was dismal. Tables 4.11-4.13 present the Undergraduate Exit Interview questions. Summary data for question (4), “Is there anything else you would like to share with us about your experiences at the SON?”, will be available on site. Because of the difference in student population and teaching methodology, the Gateway students complete an exit exam that measures their educational experience. Data will be available for review in the Resource Room.

Alumni Survey: Alumni satisfaction with the program is mixed. For the May 08 graduates, the number of respondents was only 9 alumni at six months. Of those nine alumni, 33-50% were dissatisfied with some measures, but 77% were satisfied with the skills and competencies acquired from the program. This is consistent with the exit interviews; time did not change their perception. Regarding the mixed results on items such as classroom space, student gathering area, and student input into the program, we have responded and have processes in place: the renovation of space should be complete fall semester 2009, and we are actively trying to involve the students in governance of the SON.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Student Achievement
Program level; Direct - Exam

Details/Description: ATI RN Predictor; NCLEX exam; Collegiate Assessment of Academic Proficiency (CAAP); Graduation Rate; Critical Thinking

Target: Graduating BSN students

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student Achievement

Summary of Findings: ATI RN Predictor: The ATI RN-Predictor is used to measure the likelihood of student success on the first attempt at writing the NCLEX-RN exam. The trends of the subsections of the RN-Predictor are used to evaluate the curriculum content and process. The majority of students graduating in December 2007 and May 2008 were enrolled in a required 3-semester-credit NCLEX-preparation course, N440, in the fifth semester if they did not achieve the established benchmark, a score in the 65th percentile or higher on the RN Comprehensive Predictor. This benchmark was considered by ATI to be predictive of success on the NCLEX (table 4-3). ATI recommended changing the scoring for the Comprehensive Predictor from percentile to percent probability of passing, and its recommended benchmark was 94% probability. The Undergraduate students take the exam at the end of the fourth semester, having one semester before taking the NCLEX-RN. The SON's current benchmark is 98% probability of passing the NCLEX-RN. Beginning with students in Level 5 in fall semester 2008, the process changed to better utilize our resources and student synthesis of learning in this culminating semester. Students graduating in December 2008 and May 2009 were now required to enroll in Virtual ATI (VATI) 2 weeks prior to graduation if their score on the RN-Predictor was less than the benchmark of 98% probability of passing the NCLEX. This is an online individualized NCLEX preparation course that predicts 99% probability of success. The two students who were not successful in December of 2008 on the first-time pass were enrolled in VATI but did not participate before taking the NCLEX. This is reflected in the 2009 Education Summary Report for the National Council of State Boards of Nursing.

NCLEX-RN: The SON benchmark is at or above the national percent passing of first-time BSN candidates as specified by the National Council Licensure Examination for Registered Nurses. This is also the benchmark used by the UT System and THEC for program effectiveness. The 2009 pass rate only includes the class that graduated in December 2008. As of August 2009, 23 of the 24 students who graduated in May 2009 have taken the NCLEX-RN exam; all have passed on the first attempt.

Pass rate for first-time writers of the NCLEX-RN:
2006 Pass Rate = 92.3%
2007 Pass Rate = 87.27%
2008 Pass Rate = 100%
2009 Pass Rate = 91.6%

Graduation Rate: The graduation rate for the BSN Program is calculated by dividing the number of students graduating seven semesters later by the number of students entering the nursing major. The benchmark is 85%. The graduation rate for 2008 was 83% (table 4.5). This occurred because 4 students withdrew from the program and 3 students were dismissed for academic reasons. The four students who withdrew did so for personal reasons such as complicated pregnancy, husband relocating, and other family situations. The first cohort of the RN to BSN Track began May 2007. This group of students' graduation rate is 96%.

Traditional Student Undergraduate Graduation Rates (calculation based upon seven semesters):
2006 = 93%
2007 = 88%
2008 = 83%

CAAP: All undergraduate graduating seniors are required to take the Collegiate Assessment of Academic Proficiency (CAAP), a standardized multiple-choice test used to measure content areas related to writing skills, mathematics, reading, science, and critical thinking. Aggregated results are reported. University students are randomly assigned to take only one of the five tests mentioned during the semester before they graduate. Since the exam is given after the student has completed all SON courses in addition to University general education courses, faculty believe that comparing our students with University and College students is a means of gauging student learning effectiveness.

In 2006, students graduating in the nursing major had higher mean scores in reading than other graduating students in the College, and higher means than College and total University graduating students in science reasoning and critical thinking, but lower means in writing skills and mathematics.

Critical Thinking: The SON also measures Critical Thinking using the ATI exam. There was no difference in entry/exit Critical Thinking scores for the January 2007 graduates. This is likely a regression-toward-the-mean phenomenon, as this class entered with an unusually high mean, closer to the exit scores of the other classes. The first cohort of Gateway students did not show a significant increase in scores; however, the time between testing was abbreviated, as the students were not given the entrance test until they had been in the program for about 6 months. This oversight has been corrected.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Communication skills

Mapped to: USA- CCNE- The Essentials of Baccalaureate Education for Professional Nursing Practice (2008): Essential II, Essential VI, Essential VII

Measures & Findings

Employer Satisfaction
Program level; Indirect - Survey

Details/Description: BSN Employer Survey

Target: Employers of BSN graduates

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Employer Satisfaction

Summary of Findings: Obtaining survey data from employers has historically been difficult.

Target Achievement: Not Met

Recommendations : To distribute surveys at the yearly Nursing Community Advisory Committee meeting, where there is wide representation from the agencies employing graduates of our undergraduate program.

Notes :

Substantiating Evidence:

Employment Rate
Program level; Indirect - Interview

Details/Description: Exit Interview

Target: Graduating BSN students

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Employment Rate

Summary of Findings: Employment Rate: Employment rate is gathered at exit interview. Additionally, we are able to see our graduates employed over time through Preceptorship faculty feedback, as many of our graduates ask to be preceptors when qualified. All students who desire employment have reported success over a multi-year period. Students in the Traditional Undergraduate Program are actively sought for positions as nurse techs before graduation and as professional nurses upon graduation.

Target Achievement: Exceeded

Recommendations :

Notes :


Substantiating Evidence:

Student & Alumni Satisfaction
Program level; Indirect - Survey

Details/Description: Advisement; Exit Interviews; Alumni Survey

Target: Current and graduating students and alumni

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student & Alumni Satisfaction

Summary of Findings: Advisement: Student satisfaction with advisement and student satisfaction as they exit from the program are other indicators of program effectiveness. The SON advises all pre-nursing students, one of our largest community of interest groups. Advisement is considered an important component of University effectiveness. Block advisement is performed with traditional students enrolled in the Undergraduate Program. The Pre-Nursing Group evaluates the advising process each semester (table 4.10). The benchmark for Pre-Nursing student satisfaction with advisement is 90%. The University-wide median score for advisement of SON students, on a scale of 1 to 6, is consistently 6. Student satisfaction with the Pre-Nursing Advisement process continued to improve, and SON advisement of all students is considered outstanding by the University.

Exit Interviews: As a component of ongoing continuous quality improvement, exit interviews of students at the end of the fifth semester began in spring 2007. Prior to this time, satisfaction with the Program was measured at one year post graduation by mail surveys. This method was ineffective despite many reminders; the rate of return was dismal. Tables 4.11-4.13 present the Undergraduate Exit Interview questions. Summary data for question (4), “Is there anything else you would like to share with us about your experiences at the SON?”, will be available on site. Because of the difference in student population and teaching methodology, the Gateway students complete an exit exam that measures their educational experience. Data will be available for review in the Resource Room.

Alumni Survey: Alumni satisfaction with the program is mixed. For the May 08 graduates, the number of respondents was only 9 alumni at six months. Of those nine alumni, 33-50% were dissatisfied with some measures, but 77% were satisfied with the skills and competencies acquired from the program. This is consistent with the exit interviews; time did not change their perception. Regarding the mixed results on items such as classroom space, student gathering area, and student input into the program, we have responded and have processes in place: the renovation of space should be complete fall semester 2009, and we are actively trying to involve the students in governance of the SON.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:


Student Achievement
Program level; Direct - Exam

Details/Description: ATI RN Predictor; NCLEX exam; Collegiate Assessment of Academic Proficiency (CAAP); Graduation Rate; Critical Thinking

Target: Graduating BSN students

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student Achievement

Summary of Findings: ATI RN Predictor: The ATI RN-Predictor is used to measure the likelihood of student success on the first attempt at writing the NCLEX-RN exam. The trends of the subsections of the RN-Predictor are used to evaluate the curriculum content and process. The majority of students graduating in December 2007 and May 2008 were enrolled in a required 3-semester-credit NCLEX-preparation course, N440, in the fifth semester if they did not achieve the established benchmark, a score in the 65th percentile or higher on the RN Comprehensive Predictor. This benchmark was considered by ATI to be predictive of success on the NCLEX (table 4-3). ATI recommended changing the scoring for the Comprehensive Predictor from percentile to percent probability of passing, and its recommended benchmark was 94% probability. The Undergraduate students take the exam at the end of the fourth semester, having one semester before taking the NCLEX-RN. The SON's current benchmark is 98% probability of passing the NCLEX-RN. Beginning with students in Level 5 in fall semester 2008, the process changed to better utilize our resources and student synthesis of learning in this culminating semester. Students graduating in December 2008 and May 2009 were now required to enroll in Virtual ATI (VATI) 2 weeks prior to graduation if their score on the RN-Predictor was less than the benchmark of 98% probability of passing the NCLEX. This is an online individualized NCLEX preparation course that predicts 99% probability of success. The two students who were not successful in December of 2008 on the first-time pass were enrolled in VATI but did not participate before taking the NCLEX. This is reflected in the 2009 Education Summary Report for the National Council of State Boards of Nursing.

NCLEX-RN: The SON benchmark is at or above the national percent passing of first-time BSN candidates as specified by the National Council Licensure Examination for Registered Nurses. This is also the benchmark used by the UT System and THEC for program effectiveness. The 2009 pass rate only includes the class that graduated in December 2008. As of August 2009, 23 of the 24 students who graduated in May 2009 have taken the NCLEX-RN exam; all have passed on the first attempt.

Pass rate for first-time writers of the NCLEX-RN:
2006 Pass Rate = 92.3%
2007 Pass Rate = 87.27%
2008 Pass Rate = 100%
2009 Pass Rate = 91.6%

Graduation Rate: The graduation rate for the BSN Program is calculated by dividing the number of students graduating seven semesters later by the number of students entering the nursing major. The benchmark is 85%. The graduation rate for 2008 was 83% (table 4.5). This occurred because 4 students withdrew from the program and 3 students were dismissed for academic reasons. The four students who withdrew did so for personal reasons such as complicated pregnancy, husband relocating, and other family situations. The first cohort of the RN to BSN Track began May 2007. This group of students' graduation rate is 96%.

Traditional Student Undergraduate Graduation Rates (calculation based upon seven semesters):
2006 = 93%
2007 = 88%
2008 = 83%

CAAP: All undergraduate graduating seniors are required to take the Collegiate Assessment of Academic Proficiency (CAAP), a standardized multiple-choice test used to measure content areas related to writing skills, mathematics, reading, science, and critical thinking. Aggregated results are reported. University students are randomly assigned to take only one of the five tests mentioned during the semester before they graduate. Since the exam is given after the student has completed all SON courses in addition to University general education courses, faculty believe that comparing our students with University and College students is a means of gauging student learning effectiveness.

In 2006, students graduating in the nursing major had higher mean scores in reading than other graduating students in the College, and higher means than College and total University graduating students in science reasoning and critical thinking, but lower means in writing skills and mathematics.

Critical Thinking: The SON also measures Critical Thinking using the ATI exam. There was no difference in entry/exit Critical Thinking scores for the January 2007 graduates. This is likely a regression-toward-the-mean phenomenon, as this class entered with an unusually high mean, closer to the exit scores of the other classes. The first cohort of Gateway students did not show a significant increase in scores; however, the time between testing was abbreviated, as the students were not given the entrance test until they had been in the program for about 6 months. This oversight has been corrected.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Ethical manner

Mapped to: USA- CCNE- The Essentials of Baccalaureate Education for Professional Nursing Practice (2008): Essential VIII, Essential IX

Measures & Findings

Employer Satisfaction
Program level; Indirect - Survey

Details/Description: BSN Employer Survey

Target: Employers of BSN graduates

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Employer Satisfaction

Summary of Findings: Obtaining this data has historically been difficult.

Target Achievement: Not Met

Recommendations :

Notes : To distribute surveys at the yearly Nursing Community Advisory Committee meeting, where there is wide representation from the agencies employing graduates of our undergraduate program.

Substantiating Evidence:

Employment Rate
Program level; Indirect - Interview

Details/Description: Exit Interview

Target: Graduating BSN students

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Employment Rate

Summary of Findings: Employment Rate: Employment rate is gathered at exit interview. Additionally, we are able to see our graduates employed over time through Preceptorship faculty feedback, as many of our graduates ask to be preceptors when qualified. All students who desire employment have reported success over a multi-year period. Students in the Traditional Undergraduate Program are actively sought for positions as nurse techs before graduation and as professional nurses upon graduation.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Student & Alumni Satisfaction
Program level; Indirect - Survey

Details/Description: Advisement; Exit Interviews; Alumni Survey

Target: Current and graduating students and alumni

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student & Alumni Satisfaction

Summary of Findings: Advisement: Student satisfaction with advisement and student satisfaction as they exit from the program are other indicators of program effectiveness. The SON advises all pre-nursing students, one of our largest community of interest groups. Advisement is considered an important component of University effectiveness. Block advisement is performed with traditional students enrolled in the Undergraduate Program. The Pre-Nursing Group evaluates the advising process each semester (table 4.10). The benchmark for Pre-Nursing student satisfaction with advisement is 90%. The University-wide median score for advisement of SON students, on a scale of 1 to 6, is consistently 6. Student satisfaction with the Pre-Nursing Advisement process continued to improve, and SON advisement of all students is considered outstanding by the University.

Exit Interviews: As a component of ongoing continuous quality improvement, exit interviews of students at the end of the fifth semester began in spring 2007. Prior to this time, satisfaction with the Program was measured at one year post graduation by mail surveys. This method was ineffective despite many reminders; the rate of return was dismal. Tables 4.11-4.13 present the Undergraduate Exit Interview questions. Summary data for question (4), “Is there anything else you would like to share with us about your experiences at the SON?”, will be available on site. Because of the difference in student population and teaching methodology, the Gateway students complete an exit exam that measures their educational experience. Data will be available for review in the Resource Room.

Alumni Survey: Alumni satisfaction with the program is mixed. For the May 08 graduates, the number of respondents was only 9 alumni at six months. Of those nine alumni, 33-50% were dissatisfied with some measures, but 77% were satisfied with the skills and competencies acquired from the program. This is consistent with the exit interviews; time did not change their perception. Regarding the mixed results on items such as classroom space, student gathering area, and student input into the program, we have responded and have processes in place: the renovation of space should be complete fall semester 2009, and we are actively trying to involve the students in governance of the SON.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Student Achievement
Program level; Direct - Exam

Details/Description: ATI RN Predictor; NCLEX exam; Collegiate Assessment of Academic Proficiency (CAAP); Graduation Rate; Critical Thinking

Target: Graduating BSN students

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student Achievement

Summary of Findings: ATI RN Predictor: The ATI RN-Predictor is used to measure the likelihood of student success on the first attempt at writing the NCLEX-RN exam. The trends of the subsections of the RN-Predictor are used to evaluate the curriculum content and process. The majority of students graduating in December 2007 and May 2008 were enrolled in a required 3-semester-credit NCLEX-preparation course, N440, in the fifth semester if they did not achieve the established benchmark, a score in the 65th percentile or higher on the RN Comprehensive Predictor. This benchmark was considered by ATI to be predictive of success on the NCLEX (table 4-3). ATI recommended changing the scoring for the Comprehensive Predictor from percentile to percent probability of passing, and its recommended benchmark was 94% probability. The Undergraduate students take the exam at the end of the fourth semester, having one semester before taking the NCLEX-RN. The SON's current benchmark is 98% probability of passing the NCLEX-RN. Beginning with students in Level 5 in fall semester 2008, the process changed to better utilize our resources and student synthesis of learning in this culminating semester. Students graduating in December 2008 and May 2009 were now required to enroll in Virtual ATI (VATI) 2 weeks prior to graduation if their score on the RN-Predictor was less than the benchmark of 98% probability of passing the NCLEX. This is an online individualized NCLEX preparation course that predicts 99% probability of success. The two students who were not successful in December of 2008 on the first-time pass were enrolled in VATI but did not participate before taking the NCLEX. This is reflected in the 2009 Education Summary Report for the National Council of State Boards of Nursing.

NCLEX-RN: The SON benchmark is at or above the national percent passing of first-time BSN candidates as specified by the National Council Licensure Examination for Registered Nurses. This is also the benchmark used by the UT System and THEC for program effectiveness. The 2009 pass rate only includes the class that graduated in December 2008. As of August 2009, 23 of the 24 students who graduated in May 2009 have taken the NCLEX-RN exam; all have passed on the first attempt.

Pass rate for first-time writers of the NCLEX-RN:
2006 Pass Rate = 92.3%
2007 Pass Rate = 87.27%
2008 Pass Rate = 100%
2009 Pass Rate = 91.6%

Graduation Rate: The graduation rate for the BSN Program is calculated by dividing the number of students graduating seven semesters later by the number of students entering the nursing major. The benchmark is 85%. The graduation rate for 2008 was 83% (table 4.5). This occurred because 4 students withdrew from the program and 3 students were dismissed for academic reasons. The four students who withdrew did so for personal reasons such as complicated pregnancy, husband relocating, and other family situations. The first cohort of the RN to BSN Track began May 2007. This group of students' graduation rate is 96%.

Traditional Student Undergraduate Graduation Rates (calculation based upon seven semesters):
2006 = 93%
2007 = 88%
2008 = 83%

CAAP:All undergraduate graduating seniors arerequired to take the CollegiateAssessment of Academic Proficiency(CAAP), a standardized multiple-choicetest used to measure content areasrelated to writing skills, mathematics,reading, science, and critical thinking.Aggregated results are reported.University students are randomlyassigned to take only one of the fivetests mentioned during the semesterbefore they graduate. Since the exam isgiven after the student has completed allSON courses in addition to Universitygeneral education courses, faculty believethat comparing our students withUniversity and College students is ameans of gauging student learningeffectiveness.

In 2006, students graduating in the nursing major had higher mean scores in reading than other graduating students in the College, and higher means than College and total University graduating students in science reasoning and critical thinking, but lower means in writing skills and mathematics.

Critical Thinking: The SON also measures critical thinking using the ATI exam. There was no difference in entry/exit critical thinking scores for the January 2007 graduates. This is likely a regression-toward-the-mean phenomenon, as this class entered with an unusually high mean, closer to the exit scores of the other classes. The first cohort of Gateway students did not show a significant increase in scores; however, the time between tests was abbreviated because the students were not given the entrance test until they had been in the program for about 6 months. This oversight has been corrected.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Evaluate own nursing practice

Mapped to: USA - CCNE - The Essentials of Baccalaureate Education for Professional Nursing Practice (2008): Essential II, Essential III, Essential V, Essential VIII

Measures & Findings

Student & Alumni Satisfaction
Program level; Indirect - Survey

Details/Description: Advisement, Exit Interviews, Alumni Survey

Target: Current and graduating students and alumni

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student & Alumni Satisfaction

Summary of Findings: Advisement: Student satisfaction with advisement and student satisfaction as they exit the program are additional indicators of program effectiveness. The SON advises all pre-nursing students, one of our largest community-of-interest groups. Advisement is considered an important component of University effectiveness. Block advisement is provided to traditional students enrolled in the Undergraduate Program. The Pre-Nursing Group evaluates the advising process each semester (table 4.10). The benchmark for pre-nursing student satisfaction with advisement is 90%. The University-wide median score for advisement of SON students, on a scale of 1 to 6, is consistently 6. Student satisfaction with the pre-nursing advisement process continued to improve, and SON advisement of all students is considered outstanding by the University.

Exit Interviews: As a component of ongoing continuous quality improvement, exit interviews of students at the end of the fifth semester began in spring 2007. Prior to this time, satisfaction with the Program was measured one year post graduation by mail surveys. This method was ineffective; despite many reminders, the rate of return was dismal. Tables 4.11-4.13 present the Undergraduate Exit Interview questions. Summary data for question 4, "Is there anything else you would like to share with us about your experiences at the SON?", will be available on site. Because of the differences in student population and teaching methodology, Gateway students complete an exit exam that measures their educational experience. Data will be available for review in the Resource Room.

Alumni Survey: Alumni satisfaction with the program is mixed. For the May 2008 graduates, only 9 alumni responded at six months. Of those nine alumni, 33-50% were dissatisfied with some measures, but 77% were satisfied with the skills and competencies acquired from the program. This is consistent with the exit interviews; time did not change their perception. Regarding the mixed results on items such as classroom space, student gathering areas, and student input into the program, we have responded and have processes in place: the renovation of space should be complete in fall semester 2009, and we are actively working to involve students in the governance of the SON.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Student Achievement
Program level; Direct - Exam

Details/Description: ATI RN Predictor, NCLEX exam, Collegiate Assessment of Academic Proficiency (CAAP), Graduation Rate

Target: Graduating BSN students

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student Achievement

Summary of Findings: ATI RN Predictor: The ATI RN Predictor is used to measure the likelihood of student success on the first attempt at the NCLEX-RN exam. Trends in the subsections of the RN Predictor are used to evaluate curriculum content and process. The majority of students graduating in December 2007 and May 2008 were enrolled in a required 3-semester-credit NCLEX preparation course, N440, in the fifth semester if they did not achieve the established benchmark, a score at or above the 65th percentile on the RN Comprehensive Predictor. This benchmark was considered by ATI to be predictive of success on the NCLEX (table 4-3). ATI recommended changing the scoring for the Comprehensive Predictor from percentile to percent probability of passing, with a recommended benchmark of 94% probability. Undergraduate students take the exam at the end of the fourth semester, leaving one semester before taking the NCLEX-RN. The SON's current benchmark is a 98% probability of passing the NCLEX-RN. Beginning with students in Level 5 in fall semester 2008, the process changed to make better use of our resources and of student synthesis of learning in this culminating semester. Students graduating in December 2008 and May 2009 were required to enroll in Virtual ATI (VATI) two weeks prior to graduation if their score on the RN Predictor was below the benchmark of 98% probability of passing the NCLEX. VATI is an online, individualized NCLEX preparation course that predicts a 99% probability of success. The two students who were not successful on the first attempt in December 2008 had been enrolled in VATI but did not participate before taking the NCLEX. This is reflected in the 2009 Education Summary Report for the National Council of State Boards of Nursing.

NCLEX-RN: The SON benchmark is a pass rate at or above the national percentage passing for first-time BSN candidates, as specified for the National Council Licensure Examination for Registered Nurses. This is also the benchmark used by the UT System and THEC for program effectiveness. The 2009 pass rate includes only the class that graduated in December 2008. As of August 2009, 23 of the 24 students who graduated in May 2009 had taken the NCLEX-RN exam; all passed on the first attempt.

Pass rate for first-time writers of the NCLEX-RN:
2006 Pass Rate = 92.3%
2007 Pass Rate = 87.27%
2008 Pass Rate = 100%
2009 Pass Rate = 91.6%

Graduation Rate: The graduation rate for the BSN Program is calculated by dividing the number of students graduating seven semesters later by the number of students entering the nursing major. The benchmark is 85%. The graduation rate for 2008 was 83% (table 4.5). This occurred because 4 students withdrew from the program and 3 students were dismissed for academic reasons. The four students who withdrew did so for personal reasons such as a complicated pregnancy, a husband relocating, and other family situations. The first cohort of the RN to BSN Track began in May 2007; this group's graduation rate is 96%.

Traditional Student Undergraduate Graduation Rates (calculation based upon seven semesters):
2006 = 93%
2007 = 88%
2008 = 83%

CAAP: All graduating undergraduate seniors are required to take the Collegiate Assessment of Academic Proficiency (CAAP), a standardized multiple-choice test used to measure content areas related to writing skills, mathematics, reading, science, and critical thinking. Aggregated results are reported. University students are randomly assigned to take only one of the five tests during the semester before they graduate. Since the exam is given after the student has completed all SON courses in addition to University general education courses, faculty believe that comparing our students with University and College students is a means of gauging student learning effectiveness.

In 2006, students graduating in the nursing major had higher mean scores in reading than other graduating students in the College, and higher means than College and total University graduating students in science reasoning and critical thinking, but lower means in writing skills and mathematics.

Critical Thinking: The SON also measures critical thinking using the ATI exam. There was no difference in entry/exit critical thinking scores for the January 2007 graduates. This is likely a regression-toward-the-mean phenomenon, as this class entered with an unusually high mean, closer to the exit scores of the other classes. The first cohort of Gateway students did not show a significant increase in scores; however, the time between tests was abbreviated because the students were not given the entrance test until they had been in the program for about 6 months. This oversight has been corrected.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Life-long learning.

Mapped to: USA - CCNE - The Essentials of Baccalaureate Education for Professional Nursing Practice (2008): Essential IX, Essential VII

Measures & Findings

Student & Alumni Satisfaction
Program level; Indirect - Survey

Details/Description: Advisement, Exit Interviews, Alumni Survey

Target: Current and graduating students and alumni

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student & Alumni Satisfaction

Summary of Findings: Advisement: Student satisfaction with advisement and student satisfaction as they exit the program are additional indicators of program effectiveness. The SON advises all pre-nursing students, one of our largest community-of-interest groups. Advisement is considered an important component of University effectiveness. Block advisement is provided to traditional students enrolled in the Undergraduate Program. The Pre-Nursing Group evaluates the advising process each semester (table 4.10). The benchmark for pre-nursing student satisfaction with advisement is 90%. The University-wide median score for advisement of SON students, on a scale of 1 to 6, is consistently 6. Student satisfaction with the pre-nursing advisement process continued to improve, and SON advisement of all students is considered outstanding by the University.

Exit Interviews: As a component of ongoing continuous quality improvement, exit interviews of students at the end of the fifth semester began in spring 2007. Prior to this time, satisfaction with the Program was measured one year post graduation by mail surveys. This method was ineffective; despite many reminders, the rate of return was dismal. Tables 4.11-4.13 present the Undergraduate Exit Interview questions. Summary data for question 4, "Is there anything else you would like to share with us about your experiences at the SON?", will be available on site. Because of the differences in student population and teaching methodology, Gateway students complete an exit exam that measures their educational experience. Data will be available for review in the Resource Room.

Alumni Survey: Alumni satisfaction with the program is mixed. For the May 2008 graduates, only 9 alumni responded at six months. Of those nine alumni, 33-50% were dissatisfied with some measures, but 77% were satisfied with the skills and competencies acquired from the program. This is consistent with the exit interviews; time did not change their perception. Regarding the mixed results on items such as classroom space, student gathering areas, and student input into the program, we have responded and have processes in place: the renovation of space should be complete in fall semester 2009, and we are actively working to involve students in the governance of the SON.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Student Achievement
Program level; Direct - Exam

Details/Description: ATI RN Predictor, NCLEX exam, Collegiate Assessment of Academic Proficiency (CAAP), Graduation Rate, Critical Thinking

Target: Graduating BSN students

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

No Findings Added to Student Achievement

Value individual differences

Mapped to: USA - CCNE - The Essentials of Baccalaureate Education for Professional Nursing Practice (2008): Essential I, Essential VII

Measures & Findings

Student & Alumni Satisfaction
Program level; Indirect - Survey

Details/Description: Advisement, Exit Interviews, Alumni Survey

Target: Current and graduating students and alumni

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student & Alumni Satisfaction

Summary of Findings: Advisement: Student satisfaction with advisement and student satisfaction as they exit the program are additional indicators of program effectiveness. The SON advises all pre-nursing students, one of our largest community-of-interest groups. Advisement is considered an important component of University effectiveness. Block advisement is provided to traditional students enrolled in the Undergraduate Program. The Pre-Nursing Group evaluates the advising process each semester (table 4.10). The benchmark for pre-nursing student satisfaction with advisement is 90%. The University-wide median score for advisement of SON students, on a scale of 1 to 6, is consistently 6. Student satisfaction with the pre-nursing advisement process continued to improve, and SON advisement of all students is considered outstanding by the University.

Exit Interviews: As a component of ongoing continuous quality improvement, exit interviews of students at the end of the fifth semester began in spring 2007. Prior to this time, satisfaction with the Program was measured one year post graduation by mail surveys. This method was ineffective; despite many reminders, the rate of return was dismal. Tables 4.11-4.13 present the Undergraduate Exit Interview questions. Summary data for question 4, "Is there anything else you would like to share with us about your experiences at the SON?", will be available on site. Because of the differences in student population and teaching methodology, Gateway students complete an exit exam that measures their educational experience. Data will be available for review in the Resource Room.

Alumni Survey: Alumni satisfaction with the program is mixed. For the May 2008 graduates, only 9 alumni responded at six months. Of those nine alumni, 33-50% were dissatisfied with some measures, but 77% were satisfied with the skills and competencies acquired from the program. This is consistent with the exit interviews; time did not change their perception. Regarding the mixed results on items such as classroom space, student gathering areas, and student input into the program, we have responded and have processes in place: the renovation of space should be complete in fall semester 2009, and we are actively working to involve students in the governance of the SON.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Student Achievement
Program level; Direct - Exam

Details/Description: ATI RN Predictor, NCLEX exam, Collegiate Assessment of Academic Proficiency (CAAP), Graduation Rate, Critical Thinking

Target: Graduating BSN students

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student Achievement

Summary of Findings: ATI RN Predictor: The ATI RN Predictor is used to measure the likelihood of student success on the first attempt at the NCLEX-RN exam. Trends in the subsections of the RN Predictor are used to evaluate curriculum content and process. The majority of students graduating in December 2007 and May 2008 were enrolled in a required 3-semester-credit NCLEX preparation course, N440, in the fifth semester if they did not achieve the established benchmark, a score at or above the 65th percentile on the RN Comprehensive Predictor. This benchmark was considered by ATI to be predictive of success on the NCLEX (table 4-3). ATI recommended changing the scoring for the Comprehensive Predictor from percentile to percent probability of passing, with a recommended benchmark of 94% probability. Undergraduate students take the exam at the end of the fourth semester, leaving one semester before taking the NCLEX-RN. The SON's current benchmark is a 98% probability of passing the NCLEX-RN. Beginning with students in Level 5 in fall semester 2008, the process changed to make better use of our resources and of student synthesis of learning in this culminating semester. Students graduating in December 2008 and May 2009 were required to enroll in Virtual ATI (VATI) two weeks prior to graduation if their score on the RN Predictor was below the benchmark of 98% probability of passing the NCLEX. VATI is an online, individualized NCLEX preparation course that predicts a 99% probability of success. The two students who were not successful on the first attempt in December 2008 had been enrolled in VATI but did not participate before taking the NCLEX. This is reflected in the 2009 Education Summary Report for the National Council of State Boards of Nursing.

NCLEX-RN: The SON benchmark is a pass rate at or above the national percentage passing for first-time BSN candidates, as specified for the National Council Licensure Examination for Registered Nurses. This is also the benchmark used by the UT System and THEC for program effectiveness. The 2009 pass rate includes only the class that graduated in December 2008. As of August 2009, 23 of the 24 students who graduated in May 2009 had taken the NCLEX-RN exam; all passed on the first attempt.

Pass rate for first-time writers of the NCLEX-RN:
2006 Pass Rate = 92.3%
2007 Pass Rate = 87.27%
2008 Pass Rate = 100%
2009 Pass Rate = 91.6%

Graduation Rate: The graduation rate for the BSN Program is calculated by dividing the number of students graduating seven semesters later by the number of students entering the nursing major. The benchmark is 85%. The graduation rate for 2008 was 83% (table 4.5). This occurred because 4 students withdrew from the program and 3 students were dismissed for academic reasons. The four students who withdrew did so for personal reasons such as a complicated pregnancy, a husband relocating, and other family situations. The first cohort of the RN to BSN Track began in May 2007; this group's graduation rate is 96%.

Traditional Student Undergraduate Graduation Rates (calculation based upon seven semesters):
2006 = 93%
2007 = 88%
2008 = 83%

CAAP: All graduating undergraduate seniors are required to take the Collegiate Assessment of Academic Proficiency (CAAP), a standardized multiple-choice test used to measure content areas related to writing skills, mathematics, reading, science, and critical thinking. Aggregated results are reported. University students are randomly assigned to take only one of the five tests during the semester before they graduate. Since the exam is given after the student has completed all SON courses in addition to University general education courses, faculty believe that comparing our students with University and College students is a means of gauging student learning effectiveness.

In 2006, students graduating in the nursing major had higher mean scores in reading than other graduating students in the College, and higher means than College and total University graduating students in science reasoning and critical thinking, but lower means in writing skills and mathematics.

Critical Thinking: The SON also measures critical thinking using the ATI exam. There was no difference in entry/exit critical thinking scores for the January 2007 graduates. This is likely a regression-toward-the-mean phenomenon, as this class entered with an unusually high mean, closer to the exit scores of the other classes. The first cohort of Gateway students did not show a significant increase in scores; however, the time between tests was abbreviated because the students were not given the entrance test until they had been in the program for about 6 months. This oversight has been corrected.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Demonstrate critical thinking skills.

Mapped to: USA - CCNE - The Essentials of Baccalaureate Education for Professional Nursing Practice (2008): Essential III

Measures & Findings

Employer Satisfaction
Program level; Indirect - Survey

Details/Description: BSN Employer Survey

Target: Employers of BSN graduates

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Employer Satisfaction

Summary of Findings: Obtaining this data has been historically difficult.

Target Achievement: Not Met

Recommendations :

Notes : To distribute surveys at the yearly Nursing Community Advisory Committee meeting, where there is wide representation from the agencies employing graduates of our undergraduate program.

Substantiating Evidence:

Employment Rate
Program level; Indirect - Interview

Details/Description: Exit Interview

Target: Graduating BSN students

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Employment Rate

Summary of Findings: Employment Rate: The employment rate is gathered at the exit interview. Additionally, we are able to track our graduates' employment over time through preceptorship faculty feedback, as many of our graduates ask to be preceptors when qualified. All students who desire employment have reported success over a multi-year period. Students in the Traditional Undergraduate Program are actively sought for positions as nurse techs before graduation and as professional nurses upon graduation.

Target Achievement: Exceeded

Recommendations :


Notes :

Substantiating Evidence:

Student & Alumni Satisfaction
Program level; Indirect - Survey

Details/Description: Advisement, Exit Interviews, Alumni Survey

Target: Current and graduating students and alumni

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student & Alumni Satisfaction

Summary of Findings: Advisement: Student satisfaction with advisement and student satisfaction as they exit the program are additional indicators of program effectiveness. The SON advises all pre-nursing students, one of our largest community-of-interest groups. Advisement is considered an important component of University effectiveness. Block advisement is provided to traditional students enrolled in the Undergraduate Program. The Pre-Nursing Group evaluates the advising process each semester (table 4.10). The benchmark for pre-nursing student satisfaction with advisement is 90%. The University-wide median score for advisement of SON students, on a scale of 1 to 6, is consistently 6. Student satisfaction with the pre-nursing advisement process continued to improve, and SON advisement of all students is considered outstanding by the University.

Exit Interviews: As a component of ongoing continuous quality improvement, exit interviews of students at the end of the fifth semester began in spring 2007. Prior to this time, satisfaction with the Program was measured one year post graduation by mail surveys. This method was ineffective; despite many reminders, the rate of return was dismal. Tables 4.11-4.13 present the Undergraduate Exit Interview questions. Summary data for question 4, "Is there anything else you would like to share with us about your experiences at the SON?", will be available on site. Because of the differences in student population and teaching methodology, Gateway students complete an exit exam that measures their educational experience. Data will be available for review in the Resource Room.

Alumni Survey: Alumni satisfaction with the program is mixed. For the May 2008 graduates, only 9 alumni responded at six months. Of those nine alumni, 33-50% were dissatisfied with some measures, but 77% were satisfied with the skills and competencies acquired from the program. This is consistent with the exit interviews; time did not change their perception. Regarding the mixed results on items such as classroom space, student gathering areas, and student input into the program, we have responded and have processes in place: the renovation of space should be complete in fall semester 2009, and we are actively working to involve students in the governance of the SON.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:


Student Achievement
Program level; Direct - Exam

Details/Description: ATI RN Predictor, NCLEX exam, Collegiate Assessment of Academic Proficiency (CAAP), Graduation Rate, Critical Thinking

Target: Graduating BSN students

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student Achievement

Summary of Findings: ATI RN Predictor: The ATI RN Predictor is used to measure the likelihood of student success on the first attempt at the NCLEX-RN exam. Trends in the subsections of the RN Predictor are used to evaluate curriculum content and process. The majority of students graduating in December 2007 and May 2008 were enrolled in a required 3-semester-credit NCLEX preparation course, N440, in the fifth semester if they did not achieve the established benchmark, a score at or above the 65th percentile on the RN Comprehensive Predictor. This benchmark was considered by ATI to be predictive of success on the NCLEX (table 4-3). ATI recommended changing the scoring for the Comprehensive Predictor from percentile to percent probability of passing, with a recommended benchmark of 94% probability. Undergraduate students take the exam at the end of the fourth semester, leaving one semester before taking the NCLEX-RN. The SON's current benchmark is a 98% probability of passing the NCLEX-RN. Beginning with students in Level 5 in fall semester 2008, the process changed to make better use of our resources and of student synthesis of learning in this culminating semester. Students graduating in December 2008 and May 2009 were required to enroll in Virtual ATI (VATI) two weeks prior to graduation if their score on the RN Predictor was below the benchmark of 98% probability of passing the NCLEX. VATI is an online, individualized NCLEX preparation course that predicts a 99% probability of success. The two students who were not successful on the first attempt in December 2008 had been enrolled in VATI but did not participate before taking the NCLEX. This is reflected in the 2009 Education Summary Report for the National Council of State Boards of Nursing.

NCLEX-RN: The SON benchmark is a pass rate at or above the national percentage passing for first-time BSN candidates, as specified for the National Council Licensure Examination for Registered Nurses. This is also the benchmark used by the UT System and THEC for program effectiveness. The 2009 pass rate includes only the class that graduated in December 2008. As of August 2009, 23 of the 24 students who graduated in May 2009 had taken the NCLEX-RN exam; all passed on the first attempt.

Pass rate for first-time writers of the NCLEX-RN:
2006 Pass Rate = 92.3%
2007 Pass Rate = 87.27%
2008 Pass Rate = 100%
2009 Pass Rate = 91.6%

Graduation Rate: The graduation rate for the BSN Program is calculated by dividing the number of students graduating seven semesters later by the number of students entering the nursing major. The benchmark is 85%. The graduation rate for 2008 was 83% (table 4.5). This occurred because 4 students withdrew from the program and 3 students were dismissed for academic reasons. The four students who withdrew did so for personal reasons such as a complicated pregnancy, a husband relocating, and other family situations. The first cohort of the RN to BSN Track began in May 2007; this group's graduation rate is 96%.

Traditional Student Undergraduate Graduation Rates (calculation based upon seven semesters):
2006 = 93%
2007 = 88%
2008 = 83%

CAAP: All graduating undergraduate seniors are required to take the Collegiate Assessment of Academic Proficiency (CAAP), a standardized multiple-choice test used to measure content areas related to writing skills, mathematics, reading, science, and critical thinking. Aggregated results are reported. University students are randomly assigned to take only one of the five tests during the semester before they graduate. Since the exam is given after the student has completed all SON courses in addition to University general education courses, faculty believe that comparing our students with University and College students is a means of gauging student learning effectiveness.

In 2006, students graduating in the nursing major had higher mean scores in reading than other graduating students in the College, and higher means than College and total University graduating students in science reasoning and critical thinking, but lower means in writing skills and mathematics.

Critical Thinking: The SON also measures critical thinking using the ATI exam. There was no difference in entry/exit critical thinking scores for the January 2007 graduates. This is likely a regression-toward-the-mean phenomenon, as this class entered with an unusually high mean, closer to the exit scores of the other classes. The first cohort of Gateway students did not show a significant increase in scores; however, the time between tests was abbreviated because the students were not given the entrance test until they had been in the program for about 6 months. This oversight has been corrected.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Professional behavior.

Mapped to: USA - CCNE - The Essentials of Baccalaureate Education for Professional Nursing Practice (2008): Essential VI, Essential VIII

Measures & Findings

Employer Satisfaction
Program level; Indirect - Survey

Details/Description: BSN Employer Survey

Target: Employers of BSN graduates

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Employer Satisfaction

Summary of Findings: Obtaining this data has been historically difficult.

Target Achievement: Not Met

Recommendations : To distribute surveys at the yearly Nursing Community Advisory Committee meeting, where there is wide representation from the agencies employing graduates of our undergraduate program.

Notes :

Substantiating Evidence:

Employment Rate
Program level; Indirect - Interview

Details/Description: Exit Interview

Target: Graduating BSN students

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Employment Rate

Summary of Findings: Employment Rate: The employment rate is gathered at the exit interview. Additionally, we are able to track our graduates' employment over time through preceptorship faculty feedback, as many of our graduates ask to be preceptors when qualified. All students who desire employment have reported success over a multi-year period. Students in the Traditional Undergraduate Program are actively sought for positions as nurse techs before graduation and as professional nurses upon graduation.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Student & Alumni Satisfaction
Program level; Indirect - Survey

Details/Description: Advisement, Exit Interviews, Alumni Survey

Target: Current and graduating students and alumni

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student & Alumni Satisfaction

Summary of Findings: Advisement: Student satisfaction with advisement and student satisfaction as they exit the program are additional indicators of program effectiveness. The SON advises all pre-nursing students, one of our largest community-of-interest groups. Advisement is considered an important component of University effectiveness. Block advisement is provided to traditional students enrolled in the Undergraduate Program. The Pre-Nursing Group evaluates the advising process each semester (table 4.10). The benchmark for pre-nursing student satisfaction with advisement is 90%. The University-wide median score for advisement of SON students, on a scale of 1 to 6, is consistently 6. Student satisfaction with the pre-nursing advisement process continued to improve, and SON advisement of all students is considered outstanding by the University.

Exit Interviews: As a component of ongoing continuous quality improvement, exit interviews of students at the end of the fifth semester began in spring 2007. Prior to this time, satisfaction with the Program was measured one year post graduation by mail surveys. This method was ineffective; despite many reminders, the rate of return was dismal. Tables 4.11-4.13 present the Undergraduate Exit Interview questions. Summary data for question 4, "Is there anything else you would like to share with us about your experiences at the SON?", will be available on site. Because of the differences in student population and teaching methodology, Gateway students complete an exit exam that measures their educational experience. Data will be available for review in the Resource Room.

Alumni Survey: Alumni satisfaction with the program is mixed. For the May 2008 graduates, only 9 alumni responded at six months. Of those nine alumni, 33-50% were dissatisfied with some measures, but 77% were satisfied with the skills and competencies acquired from the program. This is consistent with the exit interviews; time did not change their perception. Regarding the mixed results on items such as classroom space, student gathering areas, and student input into the program, we have responded and have processes in place: the renovation of space should be complete in fall semester 2009, and we are actively working to involve students in the governance of the SON.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Student Achievement
Program level; Direct - Exam

Details/Description: ATI RN Predictor, NCLEX exam, Collegiate Assessment of Academic Proficiency (CAAP), Graduation Rate, Critical Thinking

Target: Graduating BSN students

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student Achievement

Summary of Findings: ATI RN Predictor: The ATI RN Predictor is used to measure the likelihood of student success on the first attempt at the NCLEX-RN exam. Trends in the subsections of the RN Predictor are used to evaluate curriculum content and process. The majority of students graduating in December 2007 and May 2008 were enrolled in a required 3-semester-credit NCLEX preparation course, N440, in the fifth semester if they did not achieve the established benchmark, a score at or above the 65th percentile on the RN Comprehensive Predictor. This benchmark was considered by ATI to be predictive of success on the NCLEX (table 4-3). ATI recommended changing the scoring for the Comprehensive Predictor from percentile to percent probability of passing, with a recommended benchmark of 94% probability. Undergraduate students take the exam at the end of the fourth semester, leaving one semester before taking the NCLEX-RN. The SON's current benchmark is a 98% probability of passing the NCLEX-RN. Beginning with students in Level 5 in fall semester 2008, the process changed to make better use of our resources and of student synthesis of learning in this culminating semester. Students graduating in December 2008 and May 2009 were required to enroll in Virtual ATI (VATI) two weeks prior to graduation if their score on the RN Predictor was below the benchmark of 98% probability of passing the NCLEX. VATI is an online, individualized NCLEX preparation course that predicts a 99% probability of success. The two students who were not successful on the first attempt in December 2008 had been enrolled in VATI but did not participate before taking the NCLEX. This is reflected in the 2009 Education Summary Report for the National Council of State Boards of Nursing.

NCLEX-RN: The SON benchmark is a pass rate at or above the national percentage passing for first-time BSN candidates, as specified for the National Council Licensure Examination for Registered Nurses. This is also the benchmark used by the UT System and THEC for program effectiveness. The 2009 pass rate includes only the class that graduated in December 2008. As of August 2009, 23 of the 24 students who graduated in May 2009 had taken the NCLEX-RN exam; all passed on the first attempt.

Pass rate for first-time writers of the NCLEX-RN:
2006 Pass Rate = 92.3%
2007 Pass Rate = 87.27%
2008 Pass Rate = 100%
2009 Pass Rate = 91.6%

Graduation Rate: The graduation rate for the BSN Program is calculated by dividing the number of students graduating seven semesters later by the number of students entering the nursing major. The benchmark is 85%. The graduation rate for 2008 was 83% (table 4.5). This occurred because 4 students withdrew from the program and 3 students were dismissed for academic reasons. The four students who withdrew did so for personal reasons such as a complicated pregnancy, a husband relocating, and other family situations. The first cohort of the RN to BSN Track began in May 2007; this group's graduation rate is 96%.

Traditional Student Undergraduate Graduation Rates (calculation based upon seven semesters):
2006 = 93%
2007 = 88%
2008 = 83%

CAAP: All graduating undergraduate seniors are required to take the Collegiate Assessment of Academic Proficiency (CAAP), a standardized multiple-choice test used to measure content areas related to writing skills, mathematics, reading, science, and critical thinking. Aggregated results are reported. University students are randomly assigned to take only one of the five tests during the semester before they graduate. Since the exam is given after the student has completed all SON courses in addition to University general education courses, faculty believe that comparing our students with University and College students is a means of gauging student learning effectiveness.

In 2006, students graduating in the nursing major had higher mean scores in reading than other graduating students in the College, and higher means than College and total University graduating students in science reasoning and critical thinking, but lower means in writing skills and mathematics.

Critical Thinking: The SON also measures critical thinking using the ATI exam. There was no difference in entry/exit critical thinking scores for the January 2007 graduates. This is likely a regression-toward-the-mean phenomenon, as this class entered with an unusually high mean, closer to the exit scores of the other classes. The first cohort of Gateway students did not show a significant increase in scores; however, the time between tests was abbreviated because the students were not given the entrance test until they had been in the program for about 6 months. This oversight has been corrected.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Resource Management

Mapped to: USA - CCNE - The Essentials of Baccalaureate Education for Professional Nursing Practice (2008): Essential III, Essential IV, Essential V

Measures & Findings

Employer Satisfaction
Program level; Indirect - Survey

Details/Description: BSN Employer Survey

Target: Employers of BSN graduates

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Employer Satisfaction

Summary of Findings: Obtaining this data has been historically difficult.

Target Achievement: Not Met

Recommendations : To distribute surveys at the yearly Nursing Community Advisory Committee meeting, where there is wide representation from the agencies employing graduates of our undergraduate program.

Notes :

Substantiating Evidence:

Employment Rate
Program level; Indirect - Interview

Details/Description: Exit Interview

Target: Graduating BSN students

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Employment Rate

Summary of Findings: Employment Rate: The employment rate is gathered at the exit interview. Additionally, we are able to track our graduates' employment over time through preceptorship faculty feedback, as many of our graduates ask to be preceptors when qualified. All students who desire employment have reported success over a multi-year period. Students in the Traditional Undergraduate Program are actively sought for positions as nurse techs before graduation and as professional nurses upon graduation.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Student & Alumni Satisfaction
Program level; Indirect - Survey

Details/Description: Advisement, Exit Interviews, Alumni Survey

Target: Current and graduating students and alumni

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student & Alumni Satisfaction

Summary of Findings: Advisement: Student satisfaction with advisement and student satisfaction as they exit the program are additional indicators of program effectiveness. The SON advises all pre-nursing students, one of our largest community-of-interest groups. Advisement is considered an important component of University effectiveness. Block advisement is provided to traditional students enrolled in the Undergraduate Program. The Pre-Nursing Group evaluates the advising process each semester (table 4.10). The benchmark for pre-nursing student satisfaction with advisement is 90%. The University-wide median score for advisement of SON students, on a scale of 1 to 6, is consistently 6. Student satisfaction with the pre-nursing advisement process continued to improve, and SON advisement of all students is considered outstanding by the University.

Exit Interviews: As a component of ongoing continuous quality improvement, exit interviews of students at the end of the fifth semester began in spring 2007. Prior to this time, satisfaction with the Program was measured one year post graduation by mail surveys. This method was ineffective; despite many reminders, the rate of return was dismal. Tables 4.11-4.13 present the Undergraduate Exit Interview questions. Summary data for question 4, "Is there anything else you would like to share with us about your experiences at the SON?", will be available on site. Because of the differences in student population and teaching methodology, Gateway students complete an exit exam that measures their educational experience. Data will be available for review in the Resource Room.

Alumni Survey: Alumni satisfaction with the program is mixed. For the May 2008 graduates, only 9 alumni responded at six months. Of those nine alumni, 33-50% were dissatisfied with some measures, but 77% were satisfied with the skills and competencies acquired from the program. This is consistent with the exit interviews; time did not change their perception. Regarding the mixed results on items such as classroom space, student gathering areas, and student input into the program, we have responded and have processes in place: the renovation of space should be complete in fall semester 2009, and we are actively working to involve students in the governance of the SON.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Student Achievement
Program level; Direct - Exam

Details/Description: ATI RN Predictor, NCLEX exam, Collegiate Assessment of Academic Proficiency (CAAP), Graduation Rate, Critical Thinking

Target: Graduating BSN students

Implementation Plan (timeline): Current

Key/Responsible Personnel: Katherine Lindgren

Supporting Attachments:

Findings for Student Achievement

Summary of Findings: ATI RN Predictor: The ATI RN Predictor is used to measure the likelihood of student success on the first attempt at the NCLEX-RN exam. Trends in the subsections of the RN Predictor are used to evaluate curriculum content and process. The majority of students graduating in December 2007 and May 2008 were enrolled in a required 3-semester-credit NCLEX preparation course, N440, in the fifth semester if they did not achieve the established benchmark, a score at or above the 65th percentile on the RN Comprehensive Predictor. This benchmark was considered by ATI to be predictive of success on the NCLEX (table 4-3). ATI recommended changing the scoring for the Comprehensive Predictor from percentile to percent probability of passing, with a recommended benchmark of 94% probability. Undergraduate students take the exam at the end of the fourth semester, leaving one semester before taking the NCLEX-RN. The SON's current benchmark is a 98% probability of passing the NCLEX-RN. Beginning with students in Level 5 in fall semester 2008, the process changed to make better use of our resources and of student synthesis of learning in this culminating semester. Students graduating in December 2008 and May 2009 were required to enroll in Virtual ATI (VATI) two weeks prior to graduation if their score on the RN Predictor was below the benchmark of 98% probability of passing the NCLEX. VATI is an online, individualized NCLEX preparation course that predicts a 99% probability of success. The two students who were not successful on the first attempt in December 2008 had been enrolled in VATI but did not participate before taking the NCLEX. This is reflected in the 2009 Education Summary Report for the National Council of State Boards of Nursing.

NCLEX-RN: The SON benchmark is a pass rate at or above the national percentage passing for first-time BSN candidates, as specified for the National Council Licensure Examination for Registered Nurses. This is also the benchmark used by the UT System and THEC for program effectiveness. The 2009 pass rate includes only the class that graduated in December 2008. As of August 2009, 23 of the 24 students who graduated in May 2009 had taken the NCLEX-RN exam; all passed on the first attempt.

Pass rate for first-time writers of the NCLEX-RN:
2006 Pass Rate = 92.3%
2007 Pass Rate = 87.27%
2008 Pass Rate = 100%
2009 Pass Rate = 91.6%

Graduation Rate: The graduation rate for the BSN Program is calculated by dividing the number of students graduating seven semesters later by the number of students entering the nursing major. The benchmark is 85%. The graduation rate for 2008 was 83% (table 4.5). This occurred because 4 students withdrew from the program and 3 students were dismissed for academic reasons. The four students who withdrew did so for personal reasons such as a complicated pregnancy, a husband relocating, and other family situations. The first cohort of the RN to BSN Track began in May 2007; this group's graduation rate is 96%.

Traditional Student Undergraduate Graduation Rates (calculation based upon seven semesters):
2006 = 93%
2007 = 88%
2008 = 83%

CAAP: All graduating undergraduate seniors are required to take the Collegiate Assessment of Academic Proficiency (CAAP), a standardized multiple-choice test used to measure content areas related to writing skills, mathematics, reading, science, and critical thinking. Aggregated results are reported. University students are randomly assigned to take only one of the five tests during the semester before they graduate. Since the exam is given after the student has completed all SON courses in addition to University general education courses, faculty believe that comparing our students with University and College students is a means of gauging student learning effectiveness.

In 2006, students graduating in the nursing major had higher mean scores in reading than other graduating students in the College, and higher means than College and total University graduating students in science reasoning and critical thinking, but lower means in writing skills and mathematics.

Critical Thinking: The SON also measures critical thinking using the ATI exam. There was no difference in entry/exit critical thinking scores for the January 2007 graduates. This is likely a regression-toward-the-mean phenomenon, as this class entered with an unusually high mean, closer to the exit scores of the other classes. The first cohort of Gateway students did not show a significant increase in scores; however, the time between tests was abbreviated because the students were not given the entrance test until they had been in the program for about 6 months. This oversight has been corrected.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:


Report: Assessment Plan Details for: Physical Therapy: DPT

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Entry-Level DPT

Outcomes

NPTE Pass Rate

Mapped to: USA - CAPTE - Standards: Element CO-1, Element CO-3

Measures & Findings

NPTE Pass Rate
Program level; Direct - Exam

Details/Description: This objective is measured by reviewing the pass rate report provided by the FSBPT.

Target: Equal to or higher than the national average pass rate for first-time takers who are graduates of US PT programs

Implementation Plan (timeline): Annual

Key/Responsible Personnel: Department Head/Accreditation Committee

Supporting Attachments:

Findings for NPTE Pass Rate

Summary of Findings: 100% first-time pass rate; national average 85.92%.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence: 2008 pass rates (MHTML)

Safe Practice

Mapped to: USA - CAPTE - Standards: Element CO-1, Element CO-2, Element CO-4

Measures & Findings

Clinical Performance Instrument Safe Practice
Program level; Indirect - Other

Details/Description: Clinical Performance Instrument (PHYT 790 Clinical Internship) results as submitted by clinical instructors.

Target: 100%

Implementation Plan (timeline): Annual

Key/Responsible Personnel: Director of Clinical Education

Supporting Attachments:
APTA PT Clinical Performance Instrument (Web Link)
Link to the APTA Physical Therapist Clinical Performance Instrument for students (June 2006). The CPI used for this cycle was the December 1997 version. Hard copy of document available in the PT Department (copyrighted document).
http://www.apta.org/AM/Template.cfm?Section=CPI1&TEMPLATE=/CM/ContentDisplay.cfm&CONTENTID=59105

Findings for Clinical Performance Instrument Safe Practice

Summary of Findings: All students met CPI indicator #1 at entry level at the point of graduation.

Target Achievement: Met

Recommendations :

Notes : Hard copies of individual student CPI records available upon request in the PT department.

Substantiating Evidence:



Professional Behavior

Mapped to: USA - CAPTE - Standards: Element CO-1, Element CO-2, Element CO-4

Measures & Findings

Clinical Performance Instrument Professional Behavior
Program level; Indirect - Other

Details/Description: Clinical Performance Instrument (PHYT 790 Clinical Internship) results as submitted by clinical instructors. Indicator #3.

Target: 100%

Implementation Plan (timeline): Annual

Key/Responsible Personnel: Director of Clinical Education

Supporting Attachments:
APTA PT Clinical Performance Instrument (Web Link)
Link to the APTA Physical Therapist Clinical Performance Instrument for students (June 2006). The CPI used for this cycle was the December 1997 version. Hard copy of document available in the PT Department (copyrighted document).
http://www.apta.org/AM/Template.cfm?Section=CPI1&TEMPLATE=/CM/ContentDisplay.cfm&CONTENTID=59105

Findings for Clinical Performance Instrument Professional Behavior

Summary of Findings: All students met CPI indicator #3 at entry level at the point of graduation.

Target Achievement: Met

Recommendations :

Notes : Hard copies of individual student CPI records available upon request in the PT department.

Substantiating Evidence:

CPI Entry-Level Competence

Mapped to: USA - CAPTE - Standards: Element CO-1, Element CO-2, Element CO-4

Measures & Findings

Clinical Performance Instrument Entry-Level Competence
Program level; Indirect - Other

Details/Description: Clinical Performance Instrument (PHYT 790 Clinical Internship) results as submitted by clinical instructors. Indicators #1-24.

Target: 100%

Implementation Plan (timeline): Annual

Key/Responsible Personnel: Director of Clinical Education

Supporting Attachments:
APTA PT Clinical Performance Instrument (Web Link)
Link to the APTA Physical Therapist Clinical Performance Instrument for students (June 2006). The CPI used for this cycle was the December 1997 version. Hard copy of document available in the PT Department (copyrighted document).
http://www.apta.org/AM/Template.cfm?Section=CPI1&TEMPLATE=/CM/ContentDisplay.cfm&CONTENTID=59105

Findings for Clinical Performance Instrument Entry-Level Competence

Summary of Findings: All students met CPI indicators #1-24 at entry level at the point of graduation.

Target Achievement: Met

Recommendations :

Notes : Hard copies of individual student CPI records available upon request in the PT department.

Substantiating Evidence:



Successful Course Completion

Mapped to: USA - CAPTE - Standards: Element CO-1, Element CO-2, Element CO-4

Measures & Findings

Transcript AuditProgram level; Direct - Other

Details/Description: Transcript audit for each graduateof the program will show 100% successful completion(minimum C) of all entry-level DPT program courses.

Target: 100%

Implementation Plan (timeline): Annual graduationaudit

Key/Responsible Personnel: Registrar, Department Head

Supporting Attachments:

Findings for Transcript Audit

Summary of Findings: All graduates achieved a minimum of C in all entry-level DPT didactic courses.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:
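As described above, the transcript audit simply confirms that every graduate earned at least a C in every entry-level DPT course. A minimal sketch of that check appears below; it is not the Registrar's actual procedure, and the grade scale, course names, and sample transcripts are assumptions for illustration.

# Hypothetical sketch: flag any graduate with a course grade below C.
GRADES_C_OR_BETTER = {"A", "A-", "B+", "B", "B-", "C+", "C"}  # assumed scale

def audit_transcripts(transcripts):
    # transcripts maps graduate -> {course: letter grade}
    return [grad for grad, courses in transcripts.items()
            if any(g not in GRADES_C_OR_BETTER for g in courses.values())]

transcripts = {  # illustrative data only
    "graduate_01": {"PHYT 700": "A", "PHYT 790": "B+"},
    "graduate_02": {"PHYT 700": "B", "PHYT 790": "C"},
}
print("100% target met:", not audit_transcripts(transcripts))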


Report: Assessment Plan Details for: Social Work: BSW

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

BSW Program Outcomes

Outcomes

1.

Mapped to: No Mapping

Measures & Findings

SOCW 342 (Human Behavior and Social Environment II) Global Perspectives Rubric
Details/Description: Course Obj. #4: Apply enhanced bio-psycho-social assessment skills with families, groups, communities, and organizations. (Program Objectives 1, 2, 3, 4, 6, 7, 8, 9, 10)

Target: A score of 7 out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 342 (Human Behavior and Social Environment II) Global Perspectives Rubric

Summary of Findings: The mean score for this measurewas a 7 out of a possible 10.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:
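Throughout the BSW findings, each rubric item is summarized by its mean score and then judged against the 7-out-of-10 target (below 7 = Not Met, exactly 7 = Met, above 7 = Exceeded, as the reported scores of 6, 7, 8, and 9 suggest). The sketch below is a hypothetical illustration of that arithmetic, not the program's scoring tool; rounding the mean to a whole number is an assumption.

# Hypothetical sketch: mean rubric score vs. the 7/10 target.
def classify(scores, target=7):
    mean = round(sum(scores) / len(scores))  # assumed whole-number rounding
    if mean > target:
        return mean, "Exceeded"
    if mean == target:
        return mean, "Met"
    return mean, "Not Met"

print(classify([7, 8, 6, 7]))   # -> (7, 'Met')
print(classify([9, 8, 9, 10]))  # -> (9, 'Exceeded')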

SOCW 407 (Social Work Practice III) Rubric
Details/Description: TOTAL SCORE. Course Obj #10: Apply and evaluate research knowledge and skills through the development of a formal research proposal and completion of an agency-based research project. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric
Details/Description: Course Obj. #8: Design, implement, and evaluate a social action project in the community and, flowing from that, create a sustainable entity to continue the work to promote economic human rights in the community. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407 (Social Work Practice III) Rubric

Summary of Findings: The mean score for this item was 7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric
Details/Description: Course Obj. #3: Apply and evaluate the problem-solving process at the macro level, particularly within organizational systems. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric
Details/Description: Course Obj. #2: Strengthen and deepen understanding of populations at-risk and the need for social and economic justice. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric
Details/Description: Course Obj. #4: Develop and apply skills to form mutually collaborative and respectful professional relationships that empower clients to become aware of and to utilize personal, family, group, community, organizational, and societal assets in solving individual and collective challenges. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:


SOCW 407 (Social Work Practice III) Rubric
Details/Description: Course Obj. #7: Enhance and evaluate skills in verbal and written communication through the use of assignments from Social Action project. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric
Details/Description: Course Obj. #1: Describe the generalist social work abilities needed to work in macro settings. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric
Details/Description: Course Obj. #5: Develop community assessment and program evaluation skills. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric
Details/Description: Course Obj. #10: Enhance and evaluate skills in verbal and written communication through the use of assignments from Social Action project. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW Faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric - Community Experience
Details/Description: Course Obj. #5: Develop community assessment and program evaluation skills. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407 (Social Work Practice III) Rubric - Community Experience

Summary of Findings: The mean score for this item was 8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric - Community Experience
Details/Description: Course Obj. #4: Develop and apply skills to form mutually collaborative and respectful professional relationships that empower clients to become aware of and to utilize personal, family, group, community, organizational, and societal assets in solving individual and collective challenges. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10).

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric-Community Experience

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric - Community Experience
Details/Description: Course Obj. #6: Integrate one’s own diversity sensitivity to individuals, families, groups, and communities from differing social, cultural, racial, class, age, religious backgrounds, and those with different sexual orientations through participation in the community visitation field work. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric-Community Experience

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric - Community Experience
Details/Description: Course Obj. #6: Integrate one’s own diversity sensitivity to individuals, families, groups, and communities from differing social, cultural, racial, class, age, religious backgrounds, and those with different sexual orientations through participation in the community visitation field work. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407 (Social Work Practice III) Rubric - Community Experience

Summary of Findings: The mean score for this item was 8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417 (Applied Research) Rubric
Details/Description: Course Obj #5: Demonstrate a commitment to ethical research practice and adherence to human subject protection safeguards. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 9)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW Faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417(Applied Research )Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome
Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #8: Apply appropriate strategies for analyzing, formulating, influencing, and advocating for desired changes at all levels of government, and demonstrate a commitment to the principles of social and economic justice. (Program Goals #1, #4, #5)
LINKED WITH PROGRAM OBJECTIVES #1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome
Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #4: Apply strategies for continuous self-evaluation including supervision and consultation, and feedback from peers and other sources for self development. (Program Goals #1, #4, #5)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:


SOCW 442 (Field Education II) Field Learning Outcome
Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #2: Synthesize and integrate varied sources to inform decisions and create solutions and appropriate problem-solving strategies congruent with the social work knowledge base. (Program Goals #3, #4)
LINKED WITH PROGRAM OBJECTIVES #1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome
Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #3: Demonstrate professional behavior congruent with the social work Code of Ethics and apply strategies to effectively identify, address, and resolve ethical conflicts in professional practice. (Program Goals #2, #3, #4, #5)
LINKED WITH PROGRAM OBJECTIVES #1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 (Social Work Practice I) Ecomap Rubric
Details/Description: Course Obj #2: Recognize the psychosocial needs and strengths of individuals and families. (Program Objectives 1, 2, 3, 6, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW Faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Ecomap Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 (Social Work Practice I) Genogram Rubric
Details/Description: Course Obj #4: Implement basic skills of engagement, interviewing, data collection and assessment, intervention and evaluation appropriate with individuals, families, groups, organizations, and communities. (Program Objectives 1, 2, 3, 4, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 (Social Work Practice I) Genogram Rubric

Summary of Findings: The mean score for this item was 7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 (Social Work Practice I) Genogram Rubric
Details/Description: Course Obj #2: Recognize the psychosocial needs and strengths of individuals and families. (Program Objectives 1, 2, 3, 6, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Genogram Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 (Social Work Practice I) Genogram Rubric
Details/Description: Course Obj #2: Recognize the psychosocial needs and strengths of individuals and families. (Program Objectives 1, 2, 3, 6, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Genogram Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 (Social Work Practice III) Eco map Rubric
Details/Description: Course Obj #2: Recognize the psychosocial needs and strengths of individuals and families. (Program Objectives 1, 2, 3, 6, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 (Social Work Practice III)Eco map Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 312 (Human Behavior and the Social Environment) LIT REVIEW RUBRIC
Details/Description: Course Obj. #2: Demonstrate understanding of various theoretical explanations of forms of human diversity. (Program Objectives 1, 2, 3, 6, 8, 9) Course Obj. #7: Apply assessment skills within the context of human behavior and diversity across the lifespan with special emphasis given to interpretation of various theoretical perspectives. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8) Course Obj. #12: Evaluate various theoretical perspectives in terms of evidence-based practice and culturally competent social work practice. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 312 (Human Behavior and the Social Environment) LIT REVIEW RUBRIC

Summary of Findings: The mean score for this item was 7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 333 (Social Work Practice with Special Populations) Midterm
Details/Description: Course Obj. #1: Implement beginning interpersonal skills that are culturally competent. (Program Objectives 1, 2, 6, 7, 8, 9). Course Obj. #2: Develop and apply a framework for understanding and articulating social and cultural aspects relating to social work practice with special populations. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW FACULTY

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 333(Social Work Practice withSpecial Populations) Midterm

Summary of Findings: The mean score for this item was6.

Target Achievement: Not Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 333 (Social Work Practice with Special Populations) Midterm
Course level; Direct - Student Artifact

Details/Description: Course Obj. #1: Implementbeginning interpersonal skills that are culturallycompetent. (Program Objectives 1, 2, 6, 7, 8, 9)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall Semester 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 333(Social Work Practice withSpecial Populations) Midterm

Summary of Findings: The mean score for this item was6.

Target Achievement: Not Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 342 (Human Behavior and the Social Environment) Rubric
Details/Description: Course Obj. #4: Apply enhanced bio-psycho-social assessment skills with families, groups, communities, and organizations. (Program Objectives 1, 2, 3, 4, 6, 7, 8, 9, 10)

Target: A score of 7 out of 10 points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 342 (Human Behavior and the Social Environment) Rubric

Summary of Findings: The mean score for this item was 7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 342 (Human Behavior and the Social Environment) Rubric
Details/Description: Course Obj. #8: Critique the impact of sociocultural variables on human development and group and family functioning. (Program Objectives 1, 2, 3, 4, 6, 8, 9)

Target: A score of 7 points out of 10 for this item.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 342 (Human Behavior and theSocial Environment) Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417 (Applied Research) Rubric
Details/Description: Course Obj #5: Demonstrate a commitment to ethical research practice and adherence to human subject protection safeguards. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 9)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW Faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417 (Applied Research) Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome
Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #7: Apply a range of social work theories and evidence-based interventions with individuals, families, small groups, organizations, and communities in all types of settings. (Program Goals #1, #4, #5)
LINKED WITH PROGRAM OBJECTIVES #1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442 (Field Education II) Field Learning Outcome

Summary of Findings: The mean score for this item was 9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome
Details/Description: FIELD EDUCATION STUDENT LEARNING OUTCOME #1: Apply culturally competent evidence-based practice skills adaptable to meet the needs of individuals and groups with diverse backgrounds by utilizing understandable language and multi-culturally sensitive communication skills. (Program Goal #2)
LINKED WITH PROGRAM OBJECTIVES #1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW Faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome
Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #5: Understand the forms and mechanisms of oppression and discrimination and apply innovative social change strategies which promote both social and economic justice. (Program Goals #1, #4, #5)
LINKED WITH PROGRAM OBJECTIVES #1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome
Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #10: Function professionally within an organizational system and when appropriate, effect positive change. (Program Goals #1, #4, #5)
LINKED WITH PROGRAM OBJECTIVES #1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:


SOCW 442 (Field Education II) Field Learning Outcome
Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #6: Understand and appreciate the history of the social work profession in the context of contemporary social policy and practice and utilize this knowledge to inform practice. (Program Goals #1, #4, #5)
LINKED WITH PROGRAM OBJECTIVES #1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 490 (Senior Seminar) Eportfolio Rubric
Details/Description: Cultural Competence

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 490 ( Senior Seminar)Eportfolio Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

2.

Mapped to: No Mapping

Measures & Findings

Details/Description: Course Obj #9: Demonstrate a beginning ability and skill in using software packages to analyze statistical data. (Program Objectives 2, 4, 7, 10)

Target: A score of 7 points out of 10 for this item.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for

Summary of Findings: The mean score for this item was6.

Target Achievement: Not Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 342 (Human Behavior and Social Environment II) Rubric
Details/Description: Course Obj. #4: Apply enhanced bio-psycho-social assessment skills with families, groups, communities, and organizations. (Program Objectives 1, 2, 3, 4, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 342 (Human Behavior and Social Environment II) Rubric

Summary of Findings: The mean score for this item was 7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 342 (Human Behavior and Social Environment II) Rubric
Details/Description: Course Obj. #8: Critique the impact of sociocultural variables on human development and group and family functioning. (Program Objectives 1, 2, 3, 4, 6, 8, 9)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 342 ( Human behavior andSocial Environment II) Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric
Details/Description: Course Obj. #1: Describe the generalist social work abilities needed to work in macro settings. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric
Details/Description: Course Obj. #2: Strengthen and deepen understanding of populations at-risk and the need for social and economic justice. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric
Details/Description: Course Obj. #8: Design, implement, and evaluate a social action project in the community and, flowing from that, create a sustainable entity to continue the work to promote economic human rights in the community. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407 (Social Work Practice III) Rubric

Summary of Findings: The mean score for this item was 9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric
Details/Description: Course Obj. #10: Enhance and evaluate skills in verbal and written communication through the use of assignments from Social Action project. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric
Details/Description: Course Obj. #7: Enhance and evaluate skills in verbal and written communication through the use of assignments from Social Action project. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric
Details/Description: Course Obj. #5: Develop community assessment and program evaluation skills. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric
Details/Description: Course Obj. #3: Apply and evaluate the problem-solving process at the macro level, particularly within organizational systems. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407 (Social Work Practice III) Rubric

Summary of Findings: The mean score for this item was 8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric
Details/Description: Course Obj. #4: Develop and apply skills to form mutually collaborative and respectful professional relationships that empower clients to become aware of and to utilize personal, family, group, community, organizational, and societal assets in solving individual and collective challenges. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric - Community Experience
Details/Description: Course Obj. #1: Describe the generalist social work abilities needed to work in macro settings. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric-Community Experience

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric - Community Experience
Details/Description: Course Obj. #5: Develop community assessment and program evaluation skills. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10).

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric-Community Experience

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric - Community Experience
Details/Description: Course Obj. #6: Integrate one’s own diversity sensitivity to individuals, families, groups, and communities from differing social, cultural, racial, class, age, religious backgrounds, and those with different sexual orientations through participation in the community visitation field work. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407 (Social Work Practice III) Rubric - Community Experience

Summary of Findings: The mean score for this item was 8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric - Community Experience
Details/Description: Course Obj. #4: Develop and apply skills to form mutually collaborative and respectful professional relationships that empower clients to become aware of and to utilize personal, family, group, community, organizational, and societal assets in solving individual and collective challenges. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10).

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric-Community Experience

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric - Community Experience Rubric
Details/Description: Course Obj. #6: Integrate one’s own diversity sensitivity to individuals, families, groups, and communities from differing social, cultural, racial, class, age, religious backgrounds, and those with different sexual orientations through participation in the community visitation field work. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric-Community Experience Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417 (Applied Research) Final Research Rubric
Details/Description: Course Obj #10: Apply APA standards to professional writing with special attention to using non-biased language. (Program Objectives 2, 3, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Findings for SOCW 417 ( Applied Research) FinalResearch Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :


Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Substantiating Evidence:

SOCW 417 (Applied Research) Final Research Rubric
Details/Description: 2. Summarizes main findings in the literature on topic. (Course Obj #9: Objectively critique published studies in the social work literature. (Program Objectives 2, 3, 6, 7, 8, 9, 10) Comments:

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417 ( Applied Research) FinalResearch Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417 (Applied Research) Final Research Rubric
Details/Description: (Course Obj #7: Describe how the scientific approach can be used to test the efficacy of social interventions.) (Program Objectives 2, 3, 6, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417 ( Applied Research) FinalResearch Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417 (Applied Research) Final Research Rubric
Details/Description: (Course Obj #5: Apply APA standards to professional writing with special attention to using non-biased language.) (Program Objectives 2, 3, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417 ( Applied Research) FinalResearch Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417 (Applied Research) Final Research Rubric
Details/Description: B. (Course Obj #2: Compare and contrast the scientific approach with other ways of obtaining knowledge, and understand how the methods differ with regard to causality and generalizability.) (Program Objectives 2, 3)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417 (Applied Research) Final Research Rubric

Summary of Findings: The mean score for this item was 8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417 (Applied Research) Final Research Rubric
Details/Description: (Course Obj #3: Compare the major research designs and discuss the strengths and weaknesses of each.) (Program Objectives 2, 3)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417 ( Applied Research) FinalResearch Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417 (Applied Research) Final Research Rubric
Details/Description: (Course Obj #8: Utilize APA styles and formats when writing professional documents.) (Program Objectives 2, 3)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417 ( Applied Research) FinalResearch Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417 (Applied Research) Rubric
Details/Description: TOTAL SCORE. Course Obj #10: Apply and evaluate research knowledge and skills through the development of a formal research proposal and completion of an agency-based research project. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW Faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417(Applied Research )Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417 (Applied Research) Rubric
Details/Description: Course Obj #7: Compare and contrast both qualitative and quantitative research methodologies in social science research including the strengths and limitations of both paradigms. (Program Objectives 2, 4, 5, 6, 7, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417 (Applied Research) Rubric

Summary of Findings: The mean score for this item was 7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417 (Applied Research) Rubric
Details/Description: Course Obj #5: Demonstrate a commitment to ethical research practice and adherence to human subject protection safeguards. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 9)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW Faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417(Applied Research )Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417 (Applied Research) Rubric
Details/Description: Course Obj #5: Demonstrate a commitment to ethical research practice and adherence to human subject protection safeguards. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 9)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417(Applied Research )Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417 (Applied Research) Rubric
Details/Description: Course Obj #5: Demonstrate a commitment to ethical research practice and adherence to human subject protection safeguards. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 9)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW Faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417(Applied Research )Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417 (Applied Research) Rubric
Details/Description: Course Obj #6: Recognize the impact of research on oppressed and vulnerable populations and the ethical and value dilemmas related to researching these populations. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 9)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417 (Applied Research) Rubric

Summary of Findings: The mean score for this item was 8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417 (Applied Research) Rubric
Details/Description: (Course Obj #7: Describe how the scientific approach can be used to test the efficacy of social interventions.) (Program Objectives 2, 3, 6, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW Faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417(Applied Research )Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417 (Applied Research) Rubric
Details/Description: Course Obj #4: Evaluate the efficacy and significance of specific interventions used in social work practice through the use of self and agency evaluation. (Program Objectives 2, 3, 4, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417(Applied Research )Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome
Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #3: Demonstrate professional behavior congruent with the social work Code of Ethics and apply strategies to effectively identify, address, and resolve ethical conflicts in professional practice. (Program Goals #2, #3, #4, #5)
LINKED WITH PROGRAM OBJECTIVES #1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome
Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #5: Understand the forms and mechanisms of oppression and discrimination and apply innovative social change strategies which promote both social and economic justice. (Program Goals #1, #4, #5)
LINKED WITH PROGRAM OBJECTIVES #1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442 (Field Education II) Field Learning Outcome

Summary of Findings: The mean score for this item was 9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome
Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #8: Apply appropriate strategies for analyzing, formulating, influencing, and advocating for desired changes at all levels of government, and demonstrate a commitment to the principles of social and economic justice. (Program Goals #1, #4, #5)
LINKED WITH PROGRAM OBJECTIVES #1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome
Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #6: Understand and appreciate the history of the social work profession in the context of contemporary social policy and practice and utilize this knowledge to inform practice. (Program Goals #1, #4, #5)
LINKED WITH PROGRAM OBJECTIVES #1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome
Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #2: Synthesize and integrate varied sources to inform decisions and create solutions and appropriate problem-solving strategies congruent with the social work knowledge base. (Program Goals #3, #4)
LINKED WITH PROGRAM OBJECTIVES #1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442 (Field Education II) Field Learning Outcome

Summary of Findings: The mean score for this item was 9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome
Details/Description: FIELD EDUCATION STUDENT LEARNING OUTCOME #1: Apply culturally competent evidence-based practice skills adaptable to meet the needs of individuals and groups with diverse backgrounds by utilizing understandable language and multi-culturally sensitive communication skills. (Program Goal #2)
LINKED WITH PROGRAM OBJECTIVES #1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome
Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #7: Apply a range of social work theories and evidence-based interventions with individuals, families, small groups, organizations, and communities in all types of settings. (Program Goals #1, #4, #5)
LINKED WITH PROGRAM OBJECTIVES #1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome
Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #10: Function professionally within an organizational system and when appropriate, effect positive change. (Program Goals #1, #4, #5)
LINKED WITH PROGRAM OBJECTIVES #1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442 (Field Education II) Field Learning Outcome

Summary of Findings: The mean score for this item was 7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 ( Social Work Practice I) RubricDetails/Description: 1. Discussed “who you are” andhow that impacts (or will likely impact) clients in yourupcoming field practicum and future practice.Demonstrate a sense of professionalism and a positiveidentification with the social work profession. (ProgramObjectives 2, 4, 6, 9, 10)Comments:

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 ( Social Work Practice I) RubricDetails/Description: Integrate the components anddynamics of ecological and systems perspectives asapplied to generalist social work practice. (ProgramObjectives 2, 3, 6, 7, 8, 9)

Target: A score of 7 points out of 10 possible points

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 ( Social Work Practice I) RubricDetails/Description: Integrate social work ethics andvalues into generalist social work practice. (ProgramObjectives 1, 2, 4, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:


SOCW 306 ( Social Work Practice I) RubricDetails/Description: Analyze generalist practice throughan ecological/person-in-environment and systems theorylens. (Program Objectives 2, 3, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 ( Social Work Practice I) EcomapRubricDetails/Description: Reflective Paper: (Course Obj #2.Recognize the psychosocial needs and strengths ofindividuals and families. (Program Objectives 1, 2, 3, 6,8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Ecomap Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 ( Social Work Practice I) EcomapRubricDetails/Description: (11. Integrate the components anddynamics of ecological and systems perspectives asapplied to generalist social work practice. (ProgramObjectives 2, 3, 6, 7, 8, 9)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Ecomap Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 ( Social Work Practice I) EcomapRubricDetails/Description: 1. (Course Obj #2: Recognize thepsychosocial needs and strengths of individuals andfamilies. (Program Objectives 1, 2, 3, 6, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Ecomap Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:


SOCW 306 ( Social Work Practice I) EcomapRubricDetails/Description: Critical Thinking

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Ecomap Rubric

Summary of Findings: The mean score for this item was9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 ( Social Work Practice I) GenogramDetails/Description: (Course Obj #4: Implement basicskills of engagement, interviewing, data collection andassessment, intervention and evaluation appropriatewith individuals, families, groups, organizations, andcommunities. (Program Objectives 1, 2, 3, 4, 6, 7, 8, 9,10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Genogram

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 ( Social Work Practice I) GenogramDetails/Description: (Course Obj #2. Recognize thepsychosocial needs and strengths of individuals andfamilies. (Program Objectives 1, 2, 3, 6, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Genogram

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 ( Social Work Practice I) GenogramDetails/Description: A. Genogram (Course Obj #2:Recognize the psychosocial needs and strengths ofindividuals and families. (Program Objectives 1, 2, 3, 6,8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Genogram

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:


SOCW 306 ( Social Work Practice I) GenogramDetails/Description: Course Obj#11. Integrate thecomponents and dynamics of ecological and systemsperspectives as applied to generalist social workpractice. (Program Objectives 2, 3, 6, 7, 8, 9)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Genogram

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 312 (Human Behavior and the Social Environment) LIT REVIEW RUBRIC Details/Description: Course Obj. #2: Demonstrate understanding of various theoretical explanations of forms of human diversity. (Program Objectives 1, 2, 3, 6, 8, 9) Course Obj. #7: Apply assessment skills within the context of human behavior and diversity across the lifespan with special emphasis given to interpretation of various theoretical perspectives. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8) Course Obj. #12: Evaluate various theoretical perspectives in terms of evidence-based practice and culturally competent social work practice. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW Faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 312 (Human Behavior and the Social Environment) LIT REVIEW RUBRIC

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 312 (Human Behavior and the Social Environment) LIT REVIEW RUBRIC Details/Description: Course Obj #8: Apply enhanced analytical and critical thinking skills through the development of an annotated bibliography and literature review. (Program Objectives 2, 3, 8, 9)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 312 (Human Behavior and the Social Environment) LIT REVIEW RUBRIC

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 333 (Social Work Practice with Special Populations) Midterm Details/Description: #1: Implement beginning interpersonal skills that are culturally competent. (Program Objectives 1, 2, 6, 7, 8, 9)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 333 (Social Work Practice with Special Populations) Midterm

Summary of Findings: The mean score for this item was 6.

Target Achievement: Not Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 333(Social Work Practice with SpecialPopulations) MidtermDetails/Description: Course Obj. #1: Implementbeginning interpersonal skills that are culturallycompetent. (Program Objectives 1, 2, 6, 7, 8, 9). CourseObj. #2: Develop and apply a framework forunderstanding and articulating social and culturalaspects relating to social work practice with specialpopulations. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9,10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline):

Key/Responsible Personnel: all SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 333(Social Work Practice withSpecial Populations) Midterm

Summary of Findings: The mean score for this item was6.

Target Achievement: Not Met

Recommendations :

Notes :

Substantiating Evidence:

4.

Mapped to:No Mapping

Measures & Findings

SOCW 342 (Human Behavior and Social Environment II) Rubric Details/Description: Course Obj. #8: Critique the impact of sociocultural variables on human development and group and family functioning. (Program Objectives 1, 2, 3, 4, 6, 8, 9)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 342 (Human Behavior and Social Environment II) Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 342 (Human Behavior and Social Environment II) Rubric Details/Description: Course Obj. #4: Apply enhanced bio-psycho-social assessment skills with families, groups, communities, and organizations. (Program Objectives 1, 2, 3, 4, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 342 (Human Behavior and Social Environment II) Rubric

Summary of Findings: The mean score for this item was 7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407( Social Work Practice III) RubricDetails/Description: Course Obj. #4: Develop and applyskills to form mutually collaborative and respectfulprofessional relationships that empower clients tobecome aware of and to utilize personal, family, group,community, organizational, and societal assets insolving individual and collective challenges. (ProgramObjectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407( Social Work Practice III) RubricDetails/Description: Course Obj. #8: Design,implement, and evaluate a social action project in thecommunity and, flowing from that, create a sustainableentity to continue the work to promote economic humanrights in the community. (Program Objectives 1, 2, 3, 4,5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407( Social Work Practice III) RubricDetails/Description: Course Obj. #3: Apply andevaluate the problem-solving process at the macro level,particularly within organizational systems. (ProgramObjectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407( Social Work Practice III) RubricDetails/Description: Course Obj. #2: Strengthen anddeepen understanding of populations at-risk and theneed for social and economic justice. (ProgramObjectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407 (Social Work Practice III) Rubric

Summary of Findings: The mean score for this item was 9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407( Social Work Practice III) RubricDetails/Description: Course Obj. #1: Describe thegeneralist social work abilities needed to work in macrosettings. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9,10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407( Social Work Practice III) RubricDetails/Description: Course Obj. #5: Developcommunity assessment and program evaluation skills.(Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407( Social Work Practice III) RubricDetails/Description: Course Obj. #7: Enhance andevaluate skills in verbal and written communicationthrough the use of assignments from Social Actionproject. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407( Social Work Practice III) RubricDetails/Description: Course Obj. #5: Developcommunity assessment and program evaluation skills.(Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407 (Social Work Practice III) Rubric

Summary of Findings: The mean score for this item was 8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407( Social Work Practice III) Rubric-Community ExperienceDetails/Description: Course Obj. #5: Developcommunity assessment and program evaluation skills.(Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10).

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric-Community Experience

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407( Social Work Practice III) Rubric-Community ExperienceDetails/Description: Course Obj. #4: Develop and applyskills to form mutually collaborative and respectfulprofessional relationships that empower clients tobecome aware of and to utilize personal, family, group,community, organizational, and societal assets insolving individual and collective challenges. (ProgramObjectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10).

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric-Community Experience

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407( Social Work Practice III) Rubric-Community ExperienceDetails/Description: Course Obj. #6: Integrate one’sown diversity sensitivity to individuals, families, groups,and communities from differing social, cultural, racial,class, age, religious backgrounds, and those withdifferent sexual orientations through participation in thecommunity visitation field work. (Program Objectives 1,2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric-Community Experience

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407 (Social Work Practice III) Rubric-Community Experience Details/Description: Course Obj. #6: Integrate one’s own diversity sensitivity to individuals, families, groups, and communities from differing social, cultural, racial, class, age, religious backgrounds, and those with different sexual orientations through participation in the community visitation field work. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407 (Social Work Practice III) Rubric-Community Experience

Summary of Findings: The mean score for this item was 8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407( Social Work Practice III) Rubric-Community ExperienceDetails/Description: Course Obj. #1: Describe thegeneralist social work abilities needed to work in macrosettings. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9,10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric-Community Experience

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 407( Social Work Practice III) Rubric-Community ExperienceDetails/Description: Course Obj. #1: Describe thegeneralist social work abilities needed to work in macrosettings. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9,10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 407( Social Work Practice III)Rubric-Community Experience

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417 (Applied Research) Rubric Details/Description: TOTAL SCORE. Course Obj #10: Apply and evaluate research knowledge and skills through the development of a formal research proposal and completion of an agency-based research project. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW Faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417(Applied Research )Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:


SOCW 417(Applied Research )RubricDetails/Description: Course Obj#9: Demonstrate abeginning ability and skill in using software packages toanalyze statistical data. (Program Objectives 2, 4, 7, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417(Applied Research )Rubric

Summary of Findings: The mean score for this item was6.

Target Achievement: Not Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417(Applied Research )RubricDetails/Description: Course Obj# 5: Demonstrate acommitment to ethical research practice and adherenceto human subject protection safeguards. (ProgramObjectives1, 2, 3, 4, 5, 6, 7, 9)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW Faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417(Applied Research )Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417(Applied Research )RubricDetails/Description: Course Obj#6: Recognize theimpact of research on oppressed and vulnerablepopulations and the ethical and value dilemmas relatedto researching these populations. (Program Objectives1,2, 3, 4, 5, 6, 7, 9)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW Faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417(Applied Research )Rubric

Summary of Findings: The mean score for this item was9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417(Applied Research )RubricDetails/Description: Course Obj# 5: Demonstrate acommitment to ethical research practice and adherenceto human subject protection safeguards. (ProgramObjectives1, 2, 3, 4, 5, 6, 7, 9)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW Faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417(Applied Research )Rubric

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:


SOCW 417(Applied Research )RubricDetails/Description: (attach annotated bibliography ofreferences): Course Obj#4: Evaluate the efficacy andsignificance of specific interventions used in social workpractice through the use of self and agency evaluation.(Program Objectives 2,3, 4, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417(Applied Research )Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 417(Applied Research )RubricDetails/Description: Course Obj#7: Compare andcontrast both qualitative and quantitative researchmethodologies in social science research including thestrengths and limitations of both paradigms. (ProgramObjectives 2, 4, 5, 6, 7, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW Faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 417(Applied Research )Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442( Field Education II) Field LearningOutcomeDetails/Description: FIELD EDUCATION STUDENTLEARNING OBJECTIVE #3: Demonstrate professionalbehavior congruent with the social work Code of Ethicsand apply strategies to effectively identify, address, andresolve ethical conflicts in professional practice.(Program Goals #2, #3, #4, #5)LINKED WITH PROGRAM OBJECTIVES # 1, 2, 3, 4. 5, 6,7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #6: Understand and appreciate the history of the social work profession in the context of contemporary social policy and practice and utilize this knowledge to inform practice. (Program Goals #1, #4, #5) LINKED WITH PROGRAM OBJECTIVES # 1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442 (Field Education II) Field Learning Outcome

Summary of Findings: The mean score for this item was 7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442( Field Education II) Field LearningOutcomeDetails/Description: FIELD EDUCATION STUDENTLEARNING OBJECTIVE #4: Apply strategies forcontinuous self-evaluation including supervision andconsultation, and feedback from peers and other sourcesfor self development. (Program Goals #1, #4, #5)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442( Field Education II) Field LearningOutcomeDetails/Description: FIELD EDUCATION STUDENTLEARNING OUTCOME #1: Apply culturally competentevidence-based practice skills adaptable to meet theneeds of individuals and groups with diversebackgrounds by utilizing understandable language andmulti-culturally sensitive communication skills. (ProgramGoal #2)LINKED WITH PROGRAM OBJECTIVES # 1, 2, 3, 4. 5, 6,7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #8: Apply appropriate strategies for analyzing, formulating, influencing, and advocating for desired changes at all levels of government, and demonstrate a commitment to the principles of social and economic justice. (Program Goals #1, #4, #5) LINKED WITH PROGRAM OBJECTIVES # 1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442 (Field Education II) Field Learning Outcome

Summary of Findings: The mean score for this item was 8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442( Field Education II) Field LearningOutcomeDetails/Description: FIELD EDUCATION STUDENTLEARNING OBJECTIVE #5: Understand the forms andmechanisms of oppression and discrimination and applyinnovative social change strategies which promote bothsocial and economic justice. (Program Goals #1, #4, #5)LINKED WITH PROGRAM OBJECTIVES # 1, 2, 3, 4. 5, 6,7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442( Field Education II) Field LearningOutcomeDetails/Description: FIELD EDUCATION STUDENTLEARNING OBJECTIVE #9: Critically evaluate andanalyze the effectiveness of evidence-basedinterventions with individuals, families, small groups,organizations, and communities, using a variety ofqualitative and quantitative methods. (Program Goals#1, #4, #5)LINKED WITH PROGRAM OBJECTIVES # 1, 2, 3, 4. 5, 6,7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442( Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field Learning Outcome Details/Description: FIELD EDUCATION STUDENT LEARNING OBJECTIVE #2: Synthesize and integrate varied sources to inform decisions and create solutions and appropriate problem-solving strategies congruent with the social work knowledge base. (Program Goals #3, #4) LINKED WITH PROGRAM OBJECTIVES # 1, 2, 3, 4, 5, 6, 7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442 (Field Education II) Field Learning Outcome

Summary of Findings: The mean score for this item was 9.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 ( Social Work Practice I) RubricDetails/Description: Integrate social work ethics andvalues into generalist social work practice. (ProgramObjectives 1, 2, 4, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 ( Social Work Practice I) RubricDetails/Description: Integrate social work ethics andvalues into generalist social work practice. (ProgramObjectives 1, 2, 4, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 ( Social Work Practice I) RubricDetails/Description: 1. Discussed “who you are” andhow that impacts (or will likely impact) clients in yourupcoming field practicum and future practice.Demonstrate a sense of professionalism and a positiveidentification with the social work profession. (ProgramObjectives 2, 4, 6, 9, 10)Comments:

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 ( Social Work Practice I)Rubric

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 306 (Social Work Practice I) Genogram Details/Description: (Course Obj #4: Implement basic skills of engagement, interviewing, data collection and assessment, intervention and evaluation appropriate with individuals, families, groups, organizations, and communities. (Program Objectives 1, 2, 3, 4, 6, 7, 8, 9, 10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 306 (Social Work Practice I) Genogram

Summary of Findings: The mean score for this item was 7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 312 LIT REVIEW RUBRICDetails/Description: Course Obj. #2: Demonstrateunderstanding of various theoretical explanations offorms of human diversity. (Program Objectives 1,2, 3,6,8,9) Course Obj. #7: Apply assessment skills withinthe context of human behavior and diversity across thelifespan with special emphasis given to interpretation ofvarious theoretical perspectives. (Program Objectives 1,2, 3, 4, 5, 6, 7, 8) Course Obj. #12: Evaluate varioustheoretical perspectives in terms of evidence-basedpractice and culturally competent social work practice.(Program Objectives 1, 2, 3, 4, 5, 6, 7, 8)

Target: A score of 7 points out of 10 for this item.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: all SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 312 LIT REVIEW RUBRIC

Summary of Findings: The mean score for this item was6.

Target Achievement: Not Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 333(Social Work Practice with SpecialPopulations) MidtermDetails/Description: Course Obj. #1: Implementbeginning interpersonal skills that are culturallycompetent. (Program Objectives 1, 2, 6, 7, 8, 9). CourseObj. #2: Develop and apply a framework forunderstanding and articulating social and culturalaspects relating to social work practice with specialpopulations. (Program Objectives 1, 2, 3, 4, 5, 6, 7, 8, 9,10)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 333(Social Work Practice withSpecial Populations) Midterm

Summary of Findings: The mean score for this item was7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 333(Social Work Practice with SpecialPopulations) MidtermDetails/Description: #1: Implement beginninginterpersonal skills that are culturally competent.(Program Objectives 1, 2, 6, 7, 8, 9)

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments: ProgramGoalsObj (Microsoft Word)

Findings for SOCW 333 (Social Work Practice with Special Populations) Midterm

Summary of Findings: The mean score for this item was 7.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field LearningOutcomeDetails/Description: FIELD EDUCATION STUDENTLEARNING OBJECTIVE #7: Apply a range of social worktheories and evidence-based interventions withindividuals, families, small groups, organizations, andcommunities in all types of settings. (Program Goals #1,#4, #5)LINKED WITH PROGRAM OBJECTIVES # 1, 2, 3, 4. 5, 6,7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442 (Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

SOCW 442 (Field Education II) Field LearningOutcomeDetails/Description: FIELD EDUCATION STUDENTLEARNING OBJECTIVE #10: Function professionallywithin an organizational system and when appropriate,effect positive change. (Program Goals #1, #4, #5)LINKED WITH PROGRAM OBJECTIVES # 1, 2, 3, 4. 5, 6,7, 8, 9, 10

Target: A score of 7 points out of 10 possible points.

Implementation Plan (timeline): Fall 2009

Key/Responsible Personnel: SOCW faculty

Supporting Attachments:ProgramGoalsObj (Microsoft Word)

Findings for SOCW 442 (Field Education II) FieldLearning Outcome

Summary of Findings: The mean score for this item was8.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:


Report: Assessment Plan Details for: Engineering: BSE

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Engineering

Outcomes

Outcome 1

Mapped to:

USA- ABET-Criteria for Accrediting Engineering Programs: Element (a), Element (k)

Measures & Findings

College Assessment of Academic Proficiency (CAAP) Program level; Direct - Exam

Details/Description: College Assessment of Academic Proficiency (CAAP) is a standardized test that covers 5 sections – writing, reading, mathematics, science, and critical thinking. The completion of a section of the CAAP is a University requirement for graduation. The engineering program uses CAAP results to evaluate (1) proficiency in basic technical subjects that support engineering and (2) broad knowledge of subjects outside of engineering, such as social studies. Results in the “Math” and “Science” categories are used to assess Outcome 1.

Target: Graduating seniors from the engineering programs. Score should be greater than or equal to the 60th percentile of the national scores.

Implementation Plan (timeline): Every Semester

Key/Responsible Personnel: Student, Administrative Assistant

Supporting Attachments:

Findings for College Assessment of Academic Proficiency (CAAP)

Summary of Findings: The CAAP results for UTC engineering students graduating during the 2008-2009 academic year are:
Writing: UTC mean = 64.67 (n = 9), which is the 58.89 national percentile
Mathematics: UTC mean = 63.28 (n = 7), which is the 89.15 national percentile
Science: UTC mean = 65.08 (n = 12), which is the 74.33 national percentile
Reading: UTC mean = 61 (n = 9), which is the 42.33 national percentile
Critical Thinking: UTC mean = 65.75 (n = 12), which is the 71.67 national percentile
The goals for Writing and Reading were not met during this academic year. The writing score is close to meeting its goal; however, the reading score is much lower than desired. In addition, the average UTC Writing score as a national percentile has increased over the last four years. The UTC Reading score as a national percentile has decreased significantly in the last year. (A short sketch of this percentile check appears after this measure.)

Target Achievement: Not Met

Recommendations : The 2009-2010 scores for both writing and reading will be assessed closely to see if there is a negative trend.

Notes :

Substantiating Evidence:
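The following is a minimal sketch, not part of the report, that checks each CAAP section's national percentile against the program's 60th-percentile target. The percentile figures are the 2008-2009 values reported in the findings above; the check itself is an illustration of the stated target, not the department's actual procedure.

# Minimal sketch: compare each CAAP section's national percentile (2008-2009
# values reported above) against the 60th-percentile target stated in this plan.

CAAP_TARGET_PERCENTILE = 60.0

caap_results = {
    "Writing": 58.89,
    "Mathematics": 89.15,
    "Science": 74.33,
    "Reading": 42.33,
    "Critical Thinking": 71.67,
}

for section, percentile in caap_results.items():
    status = "Met" if percentile >= CAAP_TARGET_PERCENTILE else "Not Met"
    print(f"{section}: {percentile:.2f} percentile -> {status}")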

Course Assessments Course level; Direct - Other

Details/Description: A key tool in the assessment of program outcomes is the evaluation of courses within the curriculum. Learning objectives have been established for each course, and these objectives have been mapped to corresponding program outcomes. Upon completion of a course, the instructor assembles a course folder. If the course is involved in the assessment process, the instructor completes a course assessment form similar to that shown in Figure 3. This form is used to score each learning objective based on the level of achievement demonstrated in student work contained in the course folder. The scoring, which ranges from 1 to 3, indicates whether the performance of the assessed students reflects an expected level of achievement (score of 2), a higher than expected level of achievement (score of 3), or a lower than expected level of achievement (score of 1). Two to three additional reviewers from the engineering faculty also complete assessment forms. The scores from the three to four assessments are averaged and recorded on the first page of the form. This gives a course-specific mean score for each program outcome supported. The instructor is required to examine the completed forms and make recommendations for course modifications to address any deficiencies. The next time the course is taught, the course instructor examines the prior assessment form to see what changes to the course were suggested. Significant changes are brought to the attention of the engineering faculty. Action is taken on an as-needed basis. (A short sketch of the score-averaging step appears after this measure.)

Target: Courses that introduce and emphasize program objectives. Outcome assessment average should be greater than or equal to 2.0.

Implementation Plan (timeline): Course assessments are completed in a two-year cycle, with half the ENGR courses being evaluated in the first year and the other half being evaluated in the second year.

Key/Responsible Personnel: Wigal and all CECS faculty

Supporting Attachments:

Findings for Course Assessments

Summary of Findings: ENGR 222, 270, 303, and 307 are assessed for this outcome. ENGR 222: ENGR 270: ENGR 303: ENGR 307: The course assessments average a ?? out of 3.0.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:
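The following is a hypothetical sketch of the course-assessment roll-up described above: each reviewer scores every learning objective from 1 to 3, the three to four reviewer scores are averaged per objective, and the objectives mapped to a program outcome are averaged again to give the course-specific mean for that outcome. All course names, objective labels, and numbers below are placeholders, not actual assessment data.

# Hypothetical sketch of the course-assessment roll-up described above.
# All scores and mappings are placeholders, not actual assessment data.

from statistics import mean

# reviewer scores per learning objective (3-4 reviewers, scale 1-3)
objective_scores = {
    "obj_1": [2, 3, 2],
    "obj_2": [2, 2, 2, 3],
    "obj_3": [1, 2, 2],
}

# hypothetical mapping of learning objectives to a program outcome
outcome_1_objectives = ["obj_1", "obj_3"]

objective_means = {obj: mean(scores) for obj, scores in objective_scores.items()}
outcome_1_mean = mean(objective_means[obj] for obj in outcome_1_objectives)

# target: outcome assessment average >= 2.0
print(f"Outcome 1 course mean: {outcome_1_mean:.2f}",
      "-> Met" if outcome_1_mean >= 2.0 else "-> Not Met")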

FE Exam Program level; Direct - Exam

Details/Description: The Fundamentals of Engineering (FE) examination is a nationally administered test developed by the National Council of Examiners for Engineering and Surveying (NCEES). The FE exam covers subject matter taught in a typical baccalaureate engineering program and includes the following range of subjects: Chemistry, Computers, Electricity and Magnetism, Engineering Economics, Engineering Mechanics, Engineering Probability and Statistics, Ethics & Business Practices, Fluid Mechanics, Material Properties, Mathematics, Strength of Materials, and Thermodynamics, as well as more specific topics related to various engineering disciplines. The FE categories considered for Objective 1 are Chemistry, Computers, Engineering Mechanics, Material Properties, Math, and Engineering Probability and Statistics. The FE exam results are obtained from the Tennessee State Board of Architectural and Engineering Examiners in Nashville following each fall and spring test. All engineering students at UTC are strongly encouraged to take the FE exam toward the end of their senior year. Some students take the general test in both the morning and afternoon sessions while others elect to take the subject test in the afternoon. An overall score for each outcome is formed by averaging the scores for the categories related to that outcome. The metric goals established for the FE exam are that the overall score for UTC engineering students for each outcome will be at least 95 percent of the corresponding national average score. This goal was chosen because we have a commitment to produce students who perform at least as well as the average national engineering population of graduates. (A short sketch of this percent-of-national calculation appears after this measure.)

Target: Graduating engineering students from the various engineering programs. 95% of the national average for each subject area.

Implementation Plan (timeline): Every Semester

Key/Responsible Personnel: Wigal (UG Assistant Dean) and McDonald

Supporting Attachments:

Findings for FE Exam

Summary of Findings: The 2008-2009 UTC FE Average % Correct Scores as a Percent of National % Correct Scores are:
Mathematics: 95.06%
ENGR Probability and Statistics: 103.58%
Chemistry: 98.35%
Computers: 94.09%
Engineering Mechanics: 115.33%
Strength of Materials: 109.79%
The area of computer applications is slightly lower than the projected goal of 95%.

Target Achievement: Not Met

Recommendations : The faculty will review the FE questions being asked under the computer applications area to ensure clear understanding of the subject requirements. Then the faculty will identify where in the curriculum the topics are covered/addressed and ensure they are sufficient. In addition, the 2009-2010 data will be reviewed to see if the downward percentage is a trend or just a minor "blip".

Notes : Target is met for all except one subject area.

Substantiating Evidence:
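The following is a minimal sketch, not from the report, of the FE metric described above: each category's UTC average percent correct is expressed as a percent of the national average percent correct, the category values are averaged into an overall outcome score, and results are judged against the 95% goal. The category values are the 2008-2009 Outcome 1 figures reported above; the aggregation code itself is an illustration, not the program's actual procedure.

# Minimal sketch of the FE metric described above, using the 2008-2009
# Outcome 1 category values reported in the findings.

FE_GOAL = 95.0  # UTC score must be at least 95% of the national average

utc_vs_national = {
    "Mathematics": 95.06,
    "ENGR Probability and Statistics": 103.58,
    "Chemistry": 98.35,
    "Computers": 94.09,
    "Engineering Mechanics": 115.33,
    "Strength of Materials": 109.79,
}

overall = sum(utc_vs_national.values()) / len(utc_vs_national)
below_goal = [cat for cat, pct in utc_vs_national.items() if pct < FE_GOAL]

print(f"Overall (mean of categories): {overall:.2f}%")
print("Categories below the 95% goal:", below_goal or "none")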


Outcome 2

Mapped to:

USA- ABET-Criteria for Accrediting Engineering Programs: Element (e), Element (k)

Measures & Findings

FE Exam Program level; Direct - Exam

Details/Description: The Fundamentals of Engineering (FE) examination is a nationally administered test developed by the National Council of Examiners for Engineering and Surveying (NCEES). The FE exam covers subject matter taught in a typical baccalaureate engineering program and includes the following range of subjects: Chemistry, Computers, Electricity and Magnetism, Engineering Economics, Engineering Mechanics, Engineering Probability and Statistics, Ethics & Business Practices, Fluid Mechanics, Material Properties, Mathematics, Strength of Materials, and Thermodynamics, as well as more specific topics related to various engineering disciplines. The FE categories considered for Objective 2 are Electricity and Magnetism, Engineering Economics, Thermodynamics, Fluid Mechanics, Strength of Materials, and specialty topics. The FE exam results are obtained from the Tennessee State Board of Architectural and Engineering Examiners in Nashville following each fall and spring test. All engineering students at UTC are strongly encouraged to take the FE exam toward the end of their senior year. Some students take the general test in both the morning and afternoon sessions while others elect to take the subject test in the afternoon. An overall score for each outcome is formed by averaging the scores for the categories related to that outcome. The metric goals established for the FE exam are that the overall score for UTC engineering students for each outcome will be at least 95 percent of the corresponding national average score. This goal was chosen because we have a commitment to produce students who perform at least as well as the average national engineering population of graduates.

Target: Graduating engineering students from the various engineering programs. 95% of the national average for each subject area.

Implementation Plan (timeline): Every Semester

Key/Responsible Personnel: Wigal (UG Assistant Dean) and McDonald

Supporting Attachments:

Findings for FE Exam

Summary of Findings: The FE Exam assessment is based on the morning general exam on the following topics: electricity & magnetism, engineering economics, fluid mechanics, material properties, and thermodynamics. The level of achievement is provided below:
ENGR Economics: 100.1%
Material Properties: 104.35%
Fluid Mechanics: 112.04%
Elect & Magnetism: 108.16%
Thermodynamics: 108.63%
The UTC Engineering students exceed target achievements in all subject areas.

Target Achievement: Exceeded

Recommendations : None

Notes :

Substantiating Evidence:

Graduating Senior Survey (EBI) Program level; Indirect - Survey

Details/Description: Educational Benchmark, Inc. (EBI) produces a standardized, national survey that is administered to students at engineering schools throughout the United States. The survey contains a number of questions relevant to the seven ENGR program outcomes. A member of the Engineering faculty analyzes and interprets the EBI provided data and reports the results. The responses from all engineering students are reviewed for the ENGR program since the seven outcomes being assessed are outcomes of all engineering programs and most of the knowledge and skills associated with the outcomes are developed in the core ENGR courses. The survey was given to UTC engineering students in 2005 & 2007. The 2005 data is for all engineering graduating seniors for 2003-2004 and 2004-2005 (2 academic years). The 2007 data is for all engineering graduating seniors for 2005-2006 and 2006-2007 (2 academic years). UTC uses this survey to aid in evaluating ENGR program outcomes 2 through 7. The questions that relate to each outcome are identified below.
Outcome 2: (2003: 43,44,66) (2005: 41,44,53,54,69) (2007: 41,42,43)
Outcome 3: (2003: 38,39,40,50) (2005: 36,37,38) (2007: 36,37,38,50)
Outcome 4: (2003: 53-63) (2005: 39,47) (2007: 39,60-63,65-67)
Outcome 5: (2003: 17,42,48,49) (2005: 15,40,45,46) (2007: 15,40,46,47)
Outcome 6: (2003: 45) (2005: 42,49) (2007: 44,48,64)
Outcome 7: (2003: 46) (2005: 43,55) (2007: 49,57)
An overall score for each outcome is formed by averaging the scores for the questions related to that outcome. (A short sketch of this roll-up appears after this measure.)

Target: All UTC Graduating Engineering Students. The metric goals established for the EBI survey are that the overall UTC score for each outcome will exceed the overall average score for all participating institutions reported by EBI.

Implementation Plan (timeline): Graduating seniors complete the EBI survey at the end of the fall and spring semester every year. Every other year the collection of surveys (from the past 2 years) are provided to EBI. The survey is scored by EBI. EBI provides UTC with the UTC specific scores (overall and major) as well as scores collected from other participating schools.

Key/Responsible Personnel: CECS Department Heads, UG Assistant Dean

Supporting Attachments:

No Findings Added to Graduating Senior Survey (EBI)
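The following is a hypothetical sketch of the EBI roll-up described above: an outcome score is the average of the scores on the survey questions mapped to that outcome, compared against the all-institution average reported by EBI. The question numbers follow the 2007 mapping for Outcome 2 quoted above; the question scores, the assumed 1-7 response scale, and the all-institution average are placeholders, not EBI data.

# Hypothetical sketch of the EBI outcome roll-up described above.
# Question numbers follow the 2007 mapping for Outcome 2; scores, the assumed
# 1-7 scale, and the all-institution average are placeholders, not EBI data.

from statistics import mean

outcome_2_questions_2007 = [41, 42, 43]

# placeholder UTC mean score per question (assumed 1-7 response scale)
utc_question_means = {41: 5.4, 42: 5.1, 43: 5.6}

# placeholder all-institution average for Outcome 2
ebi_all_institution_avg = 5.2

utc_outcome_2 = mean(utc_question_means[q] for q in outcome_2_questions_2007)
print(f"UTC Outcome 2 score: {utc_outcome_2:.2f}")
print("Goal met" if utc_outcome_2 > ebi_all_institution_avg else "Goal not met")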

Outcome 3

Mapped to:

USA- ABET-Criteria for Accrediting Engineering Programs: Element (b), Element (k)

Measures & Findings

Course Assessments Course level; Direct - Other

Details/Description: A key tool in the assessment of program outcomes is the evaluation of courses within the curriculum. Learning objectives have been established for each course, and these objectives have been mapped to corresponding program outcomes. Upon completion of a course, the instructor assembles a course folder. If the course is involved in the assessment process, the instructor completes a course assessment form similar to that shown in Figure 3. This form is used to score each learning objective based on the level of achievement demonstrated in student work contained in the course folder. The scoring, which ranges from 1 to 3, indicates whether the performance of the assessed students reflects an expected level of achievement (score of 2), a higher than expected level of achievement (score of 3), or a lower than expected level of achievement (score of 1). Two to three additional reviewers from the engineering faculty also complete assessment forms. The scores from the three to four assessments are averaged and recorded on the first page of the form. This gives a course-specific mean score for each program outcome supported. The instructor is required to examine the completed forms and make recommendations for course modifications to address any deficiencies. The next time the course is taught, the course instructor examines the prior assessment form to see what changes to the course were suggested. Significant changes are brought to the attention of the engineering faculty. Action is taken on an as-needed basis.

No Findings Added to Course Assessments

Target: Courses that introduce and emphasize program objectives. Objective assessment average should be greater than or equal to 2.0.

Implementation Plan (timeline): Course assessments are completed in a two year cycle with half the ENGR courses being evaluated in the first year and the other half being evaluated in the second year.

Key/Responsible Personnel: Wigal and all CECS faculty

Supporting Attachments:
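As a rough illustration of the scoring step described above, the sketch below averages the 1-to-3 scores recorded by the instructor and the additional faculty reviewers for each learning objective and flags any average that falls below the 2.0 target; the objective names and scores are hypothetical.

```python
# Hypothetical sketch of the course-assessment averaging described above:
# each reviewer rates a learning objective from 1 (below expected) to
# 3 (above expected); the 3-4 ratings are averaged and compared to 2.0.

TARGET = 2.0

# Illustrative ratings for one course folder (instructor + reviewers).
objective_ratings = {
    "Objective 1": [2, 3, 2, 2],
    "Objective 2": [2, 2, 1],
}

for objective, ratings in objective_ratings.items():
    mean = sum(ratings) / len(ratings)
    status = "meets target" if mean >= TARGET else "below target - recommend changes"
    print(f"{objective}: mean {mean:.2f} ({status})")
```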

Graduating Senior Survey (EBI) Program level; Indirect - Survey

Details/Description: Educational Benchmark, Inc. (EBI) produces a standardized, national survey that is administered to students at engineering schools throughout the United States. The survey contains a number of questions relevant to the seven ENGR program outcomes. A member of the Engineering faculty analyzes and interprets the EBI provided data and reports the results. The responses from all engineering students are reviewed for the ENGR program since the seven outcomes being assessed are outcomes of all engineering programs and most of the knowledge and skills associated with the outcomes are developed in the core ENGR courses. UTC uses this survey to aid in evaluating ENGR program outcomes 2 through 7. An overall score for each outcome is formed by averaging the scores for the questions related to that outcome.

Target: All UTC Graduating Engineering Students. The metric goals established for the EBI survey are that the overall UTC score for each outcome will exceed the overall average score for all participating institutions reported by EBI.

Implementation Plan (timeline): Graduating seniors complete the EBI survey at the end of the fall and spring semesters every year. Every other year the collection of surveys (from the past 2 years) is provided to EBI. The survey is scored by EBI. EBI provides UTC with the UTC-specific scores (overall and by major) as well as scores collected from other participating schools.

No Findings Added to Graduating Senior Survey (EBI)

Key/Responsible Personnel: CECS Department Heads, UG Assistant Dean

Supporting Attachments:

Outcome 4

Mapped to:

USA- ABET-Criteria for Accrediting Engineering Programs: Element (c)

Measures & Findings

Course Assessments Course level; Direct - Other

Details/Description: A key tool in the assessment of program outcomes is the evaluation of courses within the curriculum. Learning objectives have been established for each course, and these objectives have been mapped to corresponding program outcomes. Upon completion of a course, the instructor assembles a course folder. If the course is involved in the assessment process, the instructor completes a course assessment form similar to that shown in Figure 3. This form is used to score each learning objective based on the level of achievement demonstrated in student work contained in the course folder. The scoring, which ranges from 1 to 3, indicates whether the performance of the assessed students reflects an expected level of achievement (score of 2), a higher than expected level of achievement (score of 3), or a lower than expected level of achievement (score of 1). Two to three additional reviewers from the engineering faculty also complete assessment forms. The scores from the three to four assessments are averaged and recorded on the first page of the form. This gives a course-specific mean score for each program outcome supported. The instructor is required to examine the completed forms and make recommendations for course modifications to address any deficiencies. The next time the course is taught, the course instructor examines the prior assessment form to see what changes to the course were suggested. Significant changes are brought to the attention of the engineering faculty. Action is taken on an as-needed basis.

No Findings Added to Course Assessments

Target: Courses that introduce and emphasize program objectives. Objective assessment average should be greater than or equal to 2.0.

Implementation Plan (timeline): Course assessments are completed in a two year cycle with half the ENGR courses being evaluated in the first year and the other half being evaluated in the second year.

Key/Responsible Personnel: Wigal and all CECS faculty

Supporting Attachments:

Graduating Senior Survey (EBI) Program level; Indirect - Survey

Details/Description: Educational Benchmark, Inc. (EBI) produces a standardized, national survey that is administered to students at engineering schools throughout the United States. The survey contains a number of questions relevant to the seven ENGR program outcomes. A member of the Engineering faculty analyzes and interprets the EBI provided data and reports the results. The responses from all engineering students are reviewed for the ENGR program since the seven outcomes being assessed are outcomes of all engineering programs and most of the knowledge and skills associated with the outcomes are developed in the core ENGR courses. UTC uses this survey to aid in evaluating ENGR program outcomes 2 through 7. An overall score for each outcome is formed by averaging the scores for the questions related to that outcome.

Target: All UTC Graduating Engineering Students. The metric goals established for the EBI survey are that the overall UTC score for each outcome will exceed the overall average score for all participating institutions reported by EBI.

Implementation Plan (timeline): Graduating seniors complete the EBI survey at the end of the fall and spring semesters every year. Every other year the collection of surveys (from the past 2 years) is provided to EBI. The survey is scored by EBI. EBI provides UTC with the UTC-specific scores (overall and by major) as well as scores collected from other participating schools.

No Findings Added to Graduating Senior Survey (EBI)

Key/Responsible Personnel: CECS Department Heads, UG Assistant Dean

Supporting Attachments:

Outcome 5

Mapped to:

USA- ABET-Criteria for Accrediting Engineering Programs: Element (d), Element (g)

Measures & Findings

Course Assessments Course level; Direct - Other

Details/Description: A key tool in the assessment of program outcomes is the evaluation of courses within the curriculum. Learning objectives have been established for each course, and these objectives have been mapped to corresponding program outcomes. Upon completion of a course, the instructor assembles a course folder. If the course is involved in the assessment process, the instructor completes a course assessment form similar to that shown in Figure 3. This form is used to score each learning objective based on the level of achievement demonstrated in student work contained in the course folder. The scoring, which ranges from 1 to 3, indicates whether the performance of the assessed students reflects an expected level of achievement (score of 2), a higher than expected level of achievement (score of 3), or a lower than expected level of achievement (score of 1). Two to three additional reviewers from the engineering faculty also complete assessment forms. The scores from the three to four assessments are averaged and recorded on the first page of the form. This gives a course-specific mean score for each program outcome supported. The instructor is required to examine the completed forms and make recommendations for course modifications to address any deficiencies. The next time the course is taught, the course instructor examines the prior assessment form to see what changes to the course were suggested. Significant changes are brought to the attention of the engineering faculty. Action is taken on an as-needed basis.

No Findings Added to Course Assessments

Target: Courses that introduce and emphasize program objectives. Objective assessment average should be greater than or equal to 2.0.

Implementation Plan (timeline): Course assessments are completed in a two year cycle with half the ENGR courses being evaluated in the first year and the other half being evaluated in the second year.

Key/Responsible Personnel: Wigal and all CECS faculty

Supporting Attachments:

Graduating Senior Survey (EBI) Program level; Indirect - Survey

Details/Description: Educational Benchmark, Inc. (EBI) produces a standardized, national survey that is administered to students at engineering schools throughout the United States. The survey contains a number of questions relevant to the seven ENGR program outcomes. A member of the Engineering faculty analyzes and interprets the EBI provided data and reports the results. The responses from all engineering students are reviewed for the ENGR program since the seven outcomes being assessed are outcomes of all engineering programs and most of the knowledge and skills associated with the outcomes are developed in the core ENGR courses. UTC uses this survey to aid in evaluating ENGR program outcomes 2 through 7. An overall score for each outcome is formed by averaging the scores for the questions related to that outcome.

Target: All UTC Graduating Engineering Students. The metric goals established for the EBI survey are that the overall UTC score for each outcome will exceed the overall average score for all participating institutions reported by EBI.

Implementation Plan (timeline): Graduating seniors complete the EBI survey at the end of the fall and spring semesters every year. Every other year the collection of surveys (from the past 2 years) is provided to EBI. The survey is scored by EBI. EBI provides UTC with the UTC-specific scores (overall and by major) as well as scores collected from other participating schools.

Key/Responsible Personnel: CECS Department Heads, UG Assistant Dean

Supporting Attachments:

No Findings Added to Graduating Senior Survey (EBI)


Outcome 6

Mapped to:

USA- ABET-Criteria for Accrediting Engineering Programs: Element (f), Element (i)

Measures & Findings

Course Assessments Course level; Direct - Other

Details/Description: A key tool in the assessment of program outcomes is the evaluation of courses within the curriculum. Learning objectives have been established for each course, and these objectives have been mapped to corresponding program outcomes. Upon completion of a course, the instructor assembles a course folder. If the course is involved in the assessment process, the instructor completes a course assessment form similar to that shown in Figure 3. This form is used to score each learning objective based on the level of achievement demonstrated in student work contained in the course folder. The scoring, which ranges from 1 to 3, indicates whether the performance of the assessed students reflects an expected level of achievement (score of 2), a higher than expected level of achievement (score of 3), or a lower than expected level of achievement (score of 1). Two to three additional reviewers from the engineering faculty also complete assessment forms. The scores from the three to four assessments are averaged and recorded on the first page of the form. This gives a course-specific mean score for each program outcome supported. The instructor is required to examine the completed forms and make recommendations for course modifications to address any deficiencies. The next time the course is taught, the course instructor examines the prior assessment form to see what changes to the course were suggested. Significant changes are brought to the attention of the engineering faculty. Action is taken on an as-needed basis.

Target: Courses that introduce and emphasize program objectives. Objective assessment average should be greater than or equal to 2.0.

Implementation Plan (timeline): Course assessments are completed in a two year cycle with half the ENGR courses being evaluated in the first year and the other half being evaluated in the second year.

Key/Responsible Personnel: Wigal and all CECS faculty

No Findings Added to Course Assessments


Supporting Attachments:

FE Exam Program level; Direct - Exam

Details/Description: The Fundamentals of Engineering (FE) examination is a nationally administered test developed by the National Council of Examiners for Engineering and Surveying (NCEES). The FE exam covers subject matter taught in a typical baccalaureate engineering program, including Chemistry, Computers, Electricity and Magnetism, Engineering Economics, Engineering Mechanics, Engineering Probability and Statistics, Ethics & Business Practices, Fluid Mechanics, Material Properties, Mathematics, Strength of Materials, and Thermodynamics, as well as more specific topics related to various engineering disciplines. The FE category considered for this outcome is Ethics & Business Practices. The FE exam results are obtained from the Tennessee State Board of Architectural and Engineering Examiners in Nashville following each fall and spring test. All engineering students at UTC are strongly encouraged to take the FE exam toward the end of their senior year. Some students take the general test in both the morning and afternoon sessions, while others elect to take the subject test in the afternoon. An overall score for each outcome is formed by averaging the scores for the categories related to that outcome. The metric goals established for the FE exam are that the overall score for UTC engineering students for each outcome will be at least 95 percent of the corresponding national average score. This goal was chosen because we have a commitment to produce students who perform at least as well as the average national engineering population of graduates.

Target: Graduating engineering students from the various engineering programs; at least 95% of the national average in each subject area.

Findings for FE Exam Summary of Findings: UTC: 82.96% correct; Nat'l: 74.67% correct; UTC % correct vs. Nat'l % correct: 111.1%.

Target Achievement: Exceeded

Recommendations : None

Notes :

Substantiating Evidence:


Implementation Plan (timeline): Every Semester

Key/Responsible Personnel: Wigal (UG Assistant Dean) and McDonald

Supporting Attachments:
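The 111.1% figure in the findings above is simply the UTC percent-correct divided by the national percent-correct, checked against the 95% metric goal stated in the description. A small sketch of that check, using the figures reported above (the function name is illustrative):

```python
# Sketch of the FE-exam metric check described above: UTC's percent-correct
# for the outcome must be at least 95% of the national percent-correct.

GOAL_RATIO = 0.95

def utc_vs_national(utc_pct_correct, natl_pct_correct):
    """Return UTC performance as a fraction of the national average."""
    return utc_pct_correct / natl_pct_correct

# Figures reported in the FE exam findings above.
ratio = utc_vs_national(82.96, 74.67)
print(f"UTC vs. national: {ratio:.1%}")                      # -> 111.1%
print("Metric goal met" if ratio >= GOAL_RATIO else "Metric goal not met")
```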

Graduating Senior Survey (EBI) Program level; Indirect - Survey

Details/Description: Educational Benchmark, Inc. (EBI) produces a standardized, national survey that is administered to students at engineering schools throughout the United States. The survey contains a number of questions relevant to the seven ENGR program outcomes. A member of the Engineering faculty analyzes and interprets the EBI provided data and reports the results. The responses from all engineering students are reviewed for the ENGR program since the seven outcomes being assessed are outcomes of all engineering programs and most of the knowledge and skills associated with the outcomes are developed in the core ENGR courses. UTC uses this survey to aid in evaluating ENGR program outcomes 2 through 7. An overall score for each outcome is formed by averaging the scores for the questions related to that outcome.

Target: All UTC Graduating Engineering Students. The metric goals established for the EBI survey are that the overall UTC score for each outcome will exceed the overall average score for all participating institutions reported by EBI.

Implementation Plan (timeline): Graduating seniors complete the EBI survey at the end of the fall and spring semesters every year. Every other year the collection of surveys (from the past 2 years) is provided to EBI. The survey is scored by EBI. EBI provides UTC with the UTC-specific scores (overall and by major) as well as scores collected from other participating schools.

Key/Responsible Personnel: CECS Department Heads, UG Assistant Dean

Supporting Attachments:

No Findings Added to Graduating Senior Survey (EBI)

Outcome 7

Mapped to:

USA- ABET-Criteria for Accrediting Engineering Programs: Element (h), Element (j)

Measures & Findings

College Assessment of Academic Proficiency (CAAP) Program level; Direct - Exam

Details/Description: College Assessment of Academic Proficiency (CAAP) is a standardized test that covers 5 sections – writing, reading, mathematics, science, and critical thinking. The completion of a section of the CAAP is a University requirement for graduation. The engineering program uses CAAP results to evaluate (1) proficiency in basic technical subjects that support engineering and (2) broad knowledge of subjects outside of engineering, such as social studies. Results in the "critical thinking" category are used to assess Outcome 7.

Target: Graduating Seniors from the Engineering Programs

Implementation Plan (timeline): Every Semester

Key/Responsible Personnel: Student, Administrative Assistant

Supporting Attachments:

Findings for College Assessment of Academic Proficiency (CAAP) Summary of Findings: The Outcome 7 CAAP results for UTC engineering students graduating during the 2007-2008 academic year are: Critical Thinking: UTC mean = 65.75 (n = 12), which corresponds to the 71.67th national percentile.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Graduating Senior Survey (EBI) Program level; Indirect - Survey

Details/Description: Educational Benchmark, Inc. (EBI) produces a standardized, national survey that is administered to students at engineering schools throughout the United States. The survey contains a number of questions relevant to the seven ENGR program outcomes. A member of the Engineering faculty analyzes and interprets the EBI provided data and reports the results. The responses from all engineering students are reviewed for the ENGR program since the seven outcomes being assessed are outcomes of all engineering programs and most of the knowledge and skills associated with the outcomes are developed in the core ENGR courses. UTC uses this survey to aid in evaluating ENGR program outcomes 2 through 7. An overall score for each outcome is formed by averaging the scores for the questions related to that outcome.

Target: All UTC Graduating Engineering Students. The metric goals established for the EBI survey are that the overall UTC score for each outcome will exceed the overall average score for all participating institutions reported by EBI.

No Findings Added to Graduating Senior Survey (EBI)

Implementation Plan (timeline): Graduating seniors complete the EBI survey at the end of the fall and spring semesters every year. Every other year the collection of surveys (from the past 2 years) is provided to EBI. The survey is scored by EBI. EBI provides UTC with the UTC-specific scores (overall and by major) as well as scores collected from other participating schools.

Key/Responsible Personnel: CECS Department Heads, UG Assistant Dean

Supporting Attachments:


Report: Assessment Plan Details for: Mechanical Engineering: BSME

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Mechanical Engineering: BSME Outcomes - for current students

Outcomes

1. Fundamentals of engineering, math, sciences and computers

Mapped to: No Mapping

Measures & Findings

1st priority - Course Assessment Course level; Direct - Portfolio

Details/Description: ENGR assessment of ENGR 222, 270, 303, & 307 course files to find evidence of Outcome 1.

Target: On the assessment form for the course being evaluated for Outcome 1, if the assessed student work scores greater than 2.0 on a scale of 1.0 to 3.0, then that work meets the outcome.

Implementation Plan (timeline): Process performed every 2 years in a six year cycle

Key/Responsible Personnel: Engineering faculty assessment committee assigned

Supporting Attachments:

Findings for 1st priority - Course Assessment Summary of Findings: The students assessed in the courses met or exceeded the assessment mean of 2.0 on a 1.0 to 3.0 scale.

Target Achievement: Met

Recommendations : To continue the use of the assessment tools to monitor student success (> 2.0 mean) on this outcome.

Notes :

Substantiating Evidence:

2nd priority - Fundamentals of Engineering Exam Program level; Direct - Exam

Details/Description: Average of all examination category scores relevant to Outcome 1. Subjects are Chemistry, Computers, Dynamics, Material Science, Math, and Statics.

Target: On a 0% to 100% scale, the target for all ME majors is 85% in all subjects related to Outcome 1.

Implementation Plan (timeline): Evaluated annually in a six year cycle

Key/Responsible Personnel: ME department head and appointed ME faculty to help with the analysis

Supporting Attachments:

Findings for 2nd priority - Fundamentals of Engineering Exam Summary of Findings: The assessment in each subject area met or exceeded the national ME mean. All percentages were greater than the 85% target except mathematics, which was 82%.

Target Achievement: Met

Recommendations : To continue the FE review subject sessions every semester with a strengthened review in mathematics.

Notes :

Substantiating Evidence:
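To make the per-subject check in the finding above concrete, the sketch below compares each FE subject percentage against the 85% target; only the mathematics figure (82%) comes from the findings, and the remaining subject percentages are hypothetical placeholders.

```python
# Hypothetical sketch of the Outcome 1 FE-exam target check described above:
# each subject's percent score for ME majors is compared to the 85% target.
# Only the mathematics value (82%) is from the findings; the rest are made up.

TARGET_PCT = 85.0

subject_pct = {
    "Chemistry": 90.0,         # hypothetical
    "Computers": 88.0,         # hypothetical
    "Dynamics": 87.0,          # hypothetical
    "Material Science": 89.0,  # hypothetical
    "Mathematics": 82.0,       # reported in the findings above
    "Statics": 91.0,           # hypothetical
}

below_target = [s for s, pct in subject_pct.items() if pct < TARGET_PCT]
for subject, pct in subject_pct.items():
    print(f"{subject}: {pct:.0f}% ({'OK' if pct >= TARGET_PCT else 'below target'})")
print("Subjects needing a strengthened review:", ", ".join(below_target) or "none")
```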


3rd priority - CAAP exam Program level; Direct - Exam

Details/Description: Annually, ME seniors take one part of the CAAP exit exam. The subject parts are writing, mathematics, science, reading and critical thinking.

Target: Seniors' scores for each subject are compared to the national norms for each subject, with the target that our ME seniors meet or exceed the national averages.

Implementation Plan (timeline): Annually

Key/Responsible Personnel: ME department head

Supporting Attachments:

Findings for 3rd priority - CAAP exam Summary of Findings: The UTC ME CAAP scores met or exceeded the national average for engineering students taking the CAAP test.

Target Achievement: Met

Recommendations : To continue the use of the assessment tools to monitor student success on this outcome (meeting and exceeding the national average of students taking the CAAP).

Notes :

Substantiating Evidence:

4th priority - EBI Senior Survey Program level; Indirect - Survey

Details/Description: Every year, each ME senior takes the EBI Senior survey. Specific survey questions are identified for Outcome 1 and are used to assess the outcome

Target: The target is for the mean scores for the UTC ME senior responses for the target questions supporting Outcome 1 to meet or exceed national mean scores for the total participating schools that are using the EBI senior survey.

Implementation Plan (timeline): Surveyed annually. Scores evaluated every two-years in a six year cycle.

Key/Responsible Personnel: ME Department head and appointed ME faculty to help with the evaluation.

Supporting Attachments:

Findings for 4th priority - EBI Senior Survey Summary of Findings: The UTC ME seniors expressed opinions concerning the questions mapped to this outcome that met and exceeded the responses from ME students from all universities participating in the EBI survey nationally.

Target Achievement: Met

Recommendations : To continue the evaluation of seniors on a continuing two-year cycle using the EBI survey results for UTC ME students and comparing them to the national ME results of those schools using the EBI survey for the same two-year cycle.

Notes :

Substantiating Evidence:

2. Engineering Tools

Mapped to: No Mapping

Measures & Findings

1st priority - Course Assessment Course level; Direct - Portfolio

Details/Description: ME assessment of ENME 304, 309, 347, 348, 442, 443, 447, & 450 course files to find evidence of Outcome 2. Also, ENME 440 and 441 are used for students in the energy option, while ENME 445, 446, or 448 are used for students in the mechanical option.

Target: On the assessment form for the course being evaluated for Outcome 2, if the assessed student work scores greater than 2.0 on a scale of 1.0 to 3.0, then that work meets the outcome.

Implementation Plan (timeline): Process performed every 2 years in a six year cycle

Key/Responsible Personnel: Engineering faculty assessment committee assigned

Supporting Attachments:

Findings for 1st priority - Course Assessment Summary of Findings: The students assessed in the courses met or exceeded the assessment mean of 2.0 on a 1.0 to 3.0 scale.

Target Achievement: Met

Recommendations : To continue the use of the assessment tools to monitor student success (> 2.0 mean) on this outcome.

Notes :

Substantiating Evidence:

2nd priority - Fundamentals of Engineering Exam Program level; Direct - Exam

Details/Description: Average of all examination category scores relevant to Outcome 2. Subjects are Electrical Circuits, Thermodynamics, Fluid Mechanics, and Mechanics of Materials.

Target: On a 0% to 100% scale, the target for all ME majors is 85% in all subjects related to Outcome 2.

Implementation Plan (timeline): Evaluated annually in a six year cycle

Key/Responsible Personnel: ME department head and appointed ME faculty to help with the analysis

Supporting Attachments:

Findings for 2nd priority - Fundamentals of Engineering Exam Summary of Findings: The assessment in each subject area met or exceeded the national ME mean. All percentages were greater than the 85% target.

Target Achievement: Met

Recommendations : To continue the FE review subject sessions every semester with a strengthened review in mathematics and other targeted subject areas (where percentages fall below the 85% target).

Notes :

Substantiating Evidence:

3rd priority - EBI Senior Survey Program level; Indirect - Survey

Details/Description: Every year, each ME senior takes the EBI Senior survey. Specific survey questions are identified for Outcome 2 and are used to assess the outcome

Target: The target is for the mean scores for the UTC ME senior responses for the target questions supporting Outcome 2 to meet or exceed national mean scores for the total participating schools that are using the EBI senior survey.

Implementation Plan (timeline): Surveyed annually. Scores evaluated every two-years in a six year cycle.

Key/Responsible Personnel: ME Department head and appointed ME faculty to help with the evaluation.

Supporting Attachments:

Findings for 3rd priority - EBI Senior Survey Summary of Findings: The UTC ME seniors expressed opinions concerning the questions mapped to this outcome that met and exceeded the responses from ME students from all universities participating in the EBI survey nationally.

Target Achievement: Met

Recommendations : To continue the evaluation of seniors on a continuing two-year cycle using the EBI survey results for UTC ME students and comparing them to the national ME results of those schools using the EBI survey for the same two-year cycle.

Notes :

Substantiating Evidence:

3. Engineering Experimentation

Mapped to: No Mapping

Measures & Findings

1st priority - Course Assessment Course level; Direct - Portfolio

Details/Description: ME assessment of ENME 347 & 447 course files to find evidence of Outcome 3.

Target: On the assessment form for the course being evaluated for Outcome 3, if the assessed student work scores greater than 2.0 on a scale of 1.0 to 3.0, then that work meets the outcome.

Implementation Plan (timeline): Process performed every 2 years in a six year cycle

Key/Responsible Personnel: Engineering faculty assessment committee assigned

Supporting Attachments:

Findings for 1st priority - Course Assessment Summary of Findings: The students assessed in the courses met or exceeded the assessment mean of 2.0 on a 1.0 to 3.0 scale.

Target Achievement: Met

Recommendations : To continue the use of the assessment tools to monitor student success (> 2.0 mean) on this outcome.

Notes :

Substantiating Evidence:

2nd priority - EBI Senior Survey Program level; Indirect - Survey

Details/Description: Every year, each ME senior takes the EBI Senior survey. Specific survey questions are identified for Outcome 3 and are used to assess the outcome

Target: The target is for the mean scores for the UTC ME senior responses for the target questions supporting Outcome 3 to meet or exceed national mean scores for the total participating schools that are using the EBI senior survey.

Implementation Plan (timeline): Surveyed annually. Scores evaluated every two-years in a six year cycle.

Key/Responsible Personnel: ME Department head and appointed ME faculty to help with the evaluation.

Supporting Attachments:

Findings for 2nd priority - EBI Senior Survey Summary of Findings: The UTC ME seniors expressed opinions concerning the questions mapped to this outcome that met and exceeded the responses from ME students from all universities participating in the EBI survey nationally.

Target Achievement: Met

Recommendations : To continue the evaluation of seniors on a continuing two-year cycle using the EBI survey results for UTC ME students and comparing them to the national ME results of those schools using the EBI survey for the same two-year cycle.

Notes :

Substantiating Evidence:

4. Engineering Design

Mapped to: No Mapping

Measures & Findings

1st priority - Course Assessment Course level; Direct - Portfolio

Details/Description: ENGR assessment of ENGR 185, 385, and 485 & ENME 442, 443, 447, & 450 course files to find evidence of Outcome 4.

Target: On the assessment form for the course being evaluated for Outcome 4, if the assessed student work scores greater than 2.0 on a scale of 1.0 to 3.0, then that work meets the outcome.

Implementation Plan (timeline): Process performed every 2 years in a six year cycle

Key/Responsible Personnel: Engineering faculty assessment committee assigned

Supporting Attachments:

Findings for 1st priority - Course Assessment Summary of Findings: The students assessed in the courses met or exceeded the assessment mean of 2.0 on a 1.0 to 3.0 scale.

Target Achievement: Met

Recommendations : To continue the use of the assessment tools to monitor student success (> 2.0 mean) on this outcome.

Notes :

Substantiating Evidence:

2nd priority - EBI Senior Survey Program level; Indirect - Survey

Details/Description: Every year, each ME senior takes the EBI Senior survey. Specific survey questions are identified for Outcome 4 and are used to assess the outcome.

Target: The target is for the mean scores for the UTC ME senior responses for the target questions supporting Outcome 4 to meet or exceed national mean scores for the total participating schools that are using the EBI senior survey.

Implementation Plan (timeline): Surveyed annually. Scores evaluated every two-years in a six year cycle.

Key/Responsible Personnel: ME Department head and appointed ME faculty to help with the evaluation.

Supporting Attachments:

Findings for 2nd priority - EBI Senior Survey Summary of Findings: The UTC ME seniors expressed opinions concerning the questions mapped to this outcome that met and exceeded the responses from ME students from all universities participating in the EBI survey nationally.

Target Achievement: Met

Recommendations : To continue the evaluation of seniors on a continuing two-year cycle using the EBI survey results for UTC ME students and comparing them to the national ME results of those schools using the EBI survey for the same two-year cycle.

Notes :

Substantiating Evidence:

5. Engineering Communication and Team Building

Mapped to: No Mapping

Measures & Findings

1st priority - Course Assessment Course level; Direct - Portfolio

Details/Description: ENGR assessment of ENGR 185, 247, 329, 385, and 485 course files to find evidence of Outcome 5.

Target: On the assessment form for the course being evaluated for Outcome 5, if the assessed student work scores greater than 2.0 on a scale of 1.0 to 3.0, then that work meets the outcome.

Implementation Plan (timeline): Process performed every 2 years in a six year cycle

Key/Responsible Personnel: Engineering faculty assessment committee assigned

Supporting Attachments:

Findings for 1st priority - Course Assessment Summary of Findings: The students assessed in the courses met or exceeded the assessment mean of 2.0 on a 1.0 to 3.0 scale.

Target Achievement: Met

Recommendations : To continue the use of the assessment tools to monitor student success (> 2.0 mean) on this outcome.

Notes :

Substantiating Evidence:

2nd priority - EBI Senior Survey Program level; Indirect - Survey

Details/Description: Every year, each ME senior takes the EBI Senior survey. Specific survey questions are identified for Outcome 5 and are used to assess the outcome

Target: The target is for the mean scores for the UTC ME senior responses for the target questions supporting Outcome 5 to meet or exceed national mean scores for the total participating schools that are using the EBI senior survey.

Implementation Plan (timeline): Surveyed annually. Scores evaluated every two-years in a six year cycle.

Key/Responsible Personnel: ME Department head and appointed ME faculty to help with the evaluation.

Findings for 2nd priority - EBI Senior Survey Summary of Findings: The UTC ME seniors expressed opinions concerning the questions mapped to this outcome that met and exceeded the responses from ME students from all universities participating in the EBI survey nationally.

Target Achievement: Met

Recommendations : To continue the evaluation of seniors on a continuing two-year cycle using the EBI survey results for UTC ME students and comparing them to the national ME results of those schools using the EBI survey for the same two-year cycle.

Notes :

Supporting Attachments:

Substantiating Evidence:

6. Engineering Ethics & Professional Development

Mapped to: No Mapping

Measures & Findings

1st priority - Course Assessment Course level; Direct - Portfolio

Details/Description: ENGR assessment of ENGR 485 & ENME 450 course files to find evidence of Outcome 6.

Target: On the assessment form for the course being evaluated for Outcome 6, if the assessed student work scores greater than 2.0 on a scale of 1.0 to 3.0, then that work meets the outcome.

Implementation Plan (timeline): Process performed every 2 years in a six year cycle

Key/Responsible Personnel: Engineering faculty assessment committee assigned

Supporting Attachments:

Findings for 1st priority - Course Assessment Summary of Findings: The students assessed in the courses met or exceeded the assessment mean of 2.0 on a 1.0 to 3.0 scale.

Target Achievement: Met

Recommendations : To continue the use of the assessment tools to monitor student success (> 2.0 mean) on this outcome.

Notes :

Substantiating Evidence:

2nd priority - EBI Senior Survey Program level; Indirect - Survey

Details/Description: Every year, each ME senior takes the EBI Senior survey. Specific survey questions are identified for Outcome 6 and are used to assess the outcome

Target: The target is for the mean scores for the UTC ME senior responses for the target questions supporting Outcome 6 to meet or exceed national mean scores for the total participating schools that are using the EBI senior survey.

Implementation Plan (timeline): Surveyed annually. Scores evaluated every two-years in a six year cycle.

Key/Responsible Personnel: ME Department head and appointed ME faculty to help with the evaluation.

Supporting Attachments:

Findings for 2nd priority - EBI Senior Survey Summary of Findings: The UTC ME seniors expressed opinions concerning the questions mapped to this outcome that met and exceeded the responses from ME students from all universities participating in the EBI survey nationally.

Target Achievement: Met

Recommendations : To continue the evaluation of seniors on a continuing two-year cycle using the EBI survey results for UTC ME students and comparing them to the national ME results of those schools using the EBI survey for the same two-year cycle.

Notes :

Substantiating Evidence:

3rd priority - Fundamentals of Engineering Exam Program level; Direct - Exam

Details/Description: Average of all examination category scores relevant to Outcome 6. Subject is Ethics.

Target: On a 0% to 100% scale, the target for all ME majors is 85% in all subjects related to Outcome 6.

Implementation Plan (timeline): Evaluated annually in a six year cycle

Key/Responsible Personnel: ME department head and appointed ME faculty to help with the analysis

Supporting Attachments:

Findings for 3rd priority - Fundamentals of Engineering Exam Summary of Findings: The assessment in the subject area met or exceeded the national ME mean. All percentages were greater than the 85% target.

Target Achievement: Met

Recommendations : To continue the FE review subject sessions every semester with a strengthened review in mathematics and other targeted subject areas (where percentages fall below the 85% target).

Notes :

Substantiating Evidence:

7. Engineering in Global Societal Context

Mapped to: No Mapping

Measures & Findings

1st priority - Course Assessment Course level; Direct - Portfolio

Details/Description: ENME assessment of ENME 443 & ENME 450 course files to find evidence of Outcome 7. For ME students in the energy option, ENME 441 is assessed, but for ME students in the mechanical option, ENME 445 is assessed.

Target: On the assessment form for the course being evaluated for Outcome 7, if the assessed student work scores greater than 2.0 on a scale of 1.0 to 3.0, then that work meets the outcome.

Implementation Plan (timeline): Process performed every 2 years in a six year cycle

Key/Responsible Personnel: Engineering faculty assessment committee assigned

Supporting Attachments:

Findings for 1st priority - Course Assessment Summary of Findings: The students assessed in the courses met or exceeded the assessment mean of 2.0 on a 1.0 to 3.0 scale.

Target Achievement: Met

Recommendations : To continue the use of the assessment tools to monitor student success (> 2.0 mean) on this outcome.

Notes :

Substantiating Evidence:

2nd priority - CAAP exam Program level; Direct - Exam

Details/Description: Annually, ME seniors take one part of the CAAP exit exam. The subject parts are writing, mathematics, science, reading and critical thinking.

Target: Seniors' scores for each subject are compared to the national norms for each subject, with the target that our ME seniors meet or exceed the national averages.

Implementation Plan (timeline): Annually

Key/Responsible Personnel: ME department head

Supporting Attachments:

Findings for 2nd priority - CAAP exam Summary of Findings: The UTC ME CAAP scores met or exceeded the national average for engineering students taking the CAAP test.

Target Achievement: Met

Recommendations : To continue the use of the assessment tools to monitor student success on this outcome (meeting and exceeding the national average of students taking the CAAP).

Notes :

Substantiating Evidence:

3rd priority - EBI Senior Survey Program level; Indirect - Survey

Details/Description: Every year, each ME senior takes the EBI Senior survey. Specific survey questions are identified for Outcome 7 and are used to assess the outcome

Target: The target is for the mean scores for the UTC ME senior responses for the target questions supporting Outcome 7 to meet or exceed national mean scores for the total participating schools that are using the EBI senior survey.

Implementation Plan (timeline): Surveyed annually. Scores evaluated every two-years in a six year cycle.

Key/Responsible Personnel: ME Department head and appointed ME faculty to help with the evaluation.

Supporting Attachments:

Findings for 3rd priority - EBI Senior Survey Summary of Findings: The UTC ME seniors expressed opinions concerning the questions mapped to this outcome that met and exceeded the responses from ME students from all universities participating in the EBI survey nationally.

Target Achievement: Met

Recommendations : To continue the evaluation of seniors on a continuing two-year cycle using the EBI survey results for UTC ME students and comparing them to the national ME results of those schools using the EBI survey for the same two-year cycle.

Notes :

Substantiating Evidence:

8. Engineering Skills to Model Systems

Mapped to: No Mapping

Measures & Findings

1st priority - Course Assessment Course level; Direct - Portfolio

Details/Description: ENME assessment of ENME 309, 443, & 447 course files to find evidence of Outcome 8. For ME students in the energy option, ENME 440 is assessed, but for ME students in the mechanical option, ENME 445 is assessed.

Target: On the assessment form for the course being evaluated for Outcome 8, if the assessed student work scores greater than 2.0 on a scale of 1.0 to 3.0, then that work meets the outcome.

Implementation Plan (timeline): Process performed every 2 years in a six year cycle

Key/Responsible Personnel: Engineering faculty assessment committee assigned

Supporting Attachments:

Findings for 1st priority - Course Assessment Summary of Findings: The students assessed in the courses met or exceeded the assessment mean of 2.0 on a 1.0 to 3.0 scale.

Target Achievement: Met

Recommendations : To continue the use of the assessment tools to monitor student success (> 2.0 mean) on this outcome.

Notes :

Substantiating Evidence:

2nd priority - EBI Senior Survey Program level; Indirect - Survey

Details/Description: Every year, each ME senior takes the EBI Senior survey. Specific survey questions are identified for Outcome 8 and are used to assess the outcome

Target: The target is for the mean scores for the UTC ME senior responses for the target questions supporting Outcome 8 to meet or exceed national mean scores for the total participating schools that are using the EBI senior survey.

Implementation Plan (timeline): Surveyed annually. Scores evaluated every two-years in a six year cycle.

Key/Responsible Personnel: ME Department head and appointed ME faculty to help with the evaluation.

Supporting Attachments:

Findings for 2nd priority - EBI Senior Survey Summary of Findings: The UTC ME seniors expressed opinions concerning the questions mapped to this outcome that met and exceeded the responses from ME students from all universities participating in the EBI survey nationally.

Target Achievement: Met

Recommendations : To continue the evaluation of seniors on a continuing two-year cycle using the EBI survey results for UTC ME students and comparing them to the national ME results of those schools using the EBI survey for the same two-year cycle.

Notes :

Substantiating Evidence:


9. Engineering Skills to Analyze Systems

Mapped to: No Mapping

Measures & Findings

1st priority - Course Assessment Course level; Direct - Portfolio

Details/Description: ENME assessment of ENME 309, 347, 348, 442, & 443 course files to find evidence of Outcome 9. For ME students in the energy option, ENME 440 is assessed, but for ME students in the mechanical option, ENME 445 or ENME 448 is assessed.

Target: On the assessment form for the course being evaluated for Outcome 9, if the assessed student work scores greater than 2.0 on a scale of 1.0 to 3.0, then that work meets the outcome.

Implementation Plan (timeline): Process performed every 2 years in a six year cycle

Key/Responsible Personnel: Engineering faculty assessment committee assigned

Supporting Attachments:

Findings for 1st priority - Course Assessment Summary of Findings: The students assessed in the courses met or exceeded the assessment mean of 2.0 on a 1.0 to 3.0 scale.

Target Achievement: Met

Recommendations : To continue the use of the assessment tools to monitor student success (> 2.0 mean) on this outcome.

Notes :

Substantiating Evidence:

2nd priority - Fundamentals of Engineering Exam Program level; Direct - Exam

Details/Description: Average of all examination category scores relevant to Outcome 9. Subjects are Chemistry, Computers, Dynamics, Material Science, Mathematics, Statics, Electrical Circuits, Thermodynamics, Fluid Mechanics, Mechanics of Materials, and Ethics.

Target: On a 0% to 100% scale, the target for all ME majors is 85% in all subjects related to Outcome 9.

Implementation Plan (timeline): Evaluated annually in a six year cycle

Key/Responsible Personnel: ME department head and appointed ME faculty to help with the analysis

Supporting Attachments:

Findings for 2nd priority - Fundamentals of Engineering Exam Summary of Findings: The assessment in each subject area met or exceeded the national ME mean. All percentages were greater than the 85% target.

Target Achievement: Met

Recommendations : To continue the FE review subject sessions every semester with a strengthened review in mathematics and other targeted subject areas (where percentages fall below the 85% target).

Notes :

Substantiating Evidence:

3rd priority - EBI Senior Survey Program level; Indirect - Survey

Details/Description: Every year, each ME senior takes the EBI Senior survey. Specific survey questions are identified for Outcome 9 and are used to assess the outcome

Target: The target is for the mean scores for the UTC ME senior responses for the target questions supporting Outcome 9 to meet or exceed national mean scores for the total participating schools that are using the EBI senior survey.

Implementation Plan (timeline): Surveyed annually. Scores evaluated every two-years in a six year cycle.

Key/Responsible Personnel: ME Department head and appointed ME faculty to help with the evaluation.

Supporting Attachments:

Findings for 3rd priority - EBI Senior Survey Summary of Findings: The UTC ME seniors expressed opinions concerning the questions mapped to this outcome that met and exceeded the responses from ME students from all universities participating in the EBI survey nationally.

Target Achievement: Met

Recommendations : To continue the evaluation of seniors on a continuing two-year cycle using the EBI survey results for UTC ME students and comparing them to the national ME results of those schools using the EBI survey for the same two-year cycle.

Notes :

Substantiating Evidence:

10. Engineering Skills to Design Systems

Mapped to: No Mapping

Measures & Findings

1st priority - Course Assessment Course level; Direct - Portfolio

Details/Description: ENME assessment of ENME 442, 443 & 450 course files to find evidence of Outcome 10.

Target: On the assessment form for the course being evaluated for Outcome 10, if the assessed student work scores greater than 2.0 on a scale of 1.0 to 3.0, then that work meets the outcome.

Implementation Plan (timeline): Process performed every 2 years in a six year cycle

Key/Responsible Personnel: Engineering faculty assessment committee assigned

Supporting Attachments:

Findings for 1st priority - Course Assessment Summary of Findings: The students assessed in the courses met or exceeded the assessment mean of 2.0 on a 1.0 to 3.0 scale.

Target Achievement: Met

Recommendations : To continue the use of the assessment tools to monitor student success (> 2.0 mean) on this outcome.

Notes :

Substantiating Evidence:

2nd priority - EBI Senior Survey Program level; Indirect - Survey

Details/Description: Every year, each ME senior takes the EBI Senior survey. Specific survey questions are identified for Outcome 10 and are used to assess the outcome

Target: The target is for the mean scores for the UTC ME senior responses for the target questions supporting Outcome 10 to meet or exceed national mean scores for the total participating schools that are using the EBI senior survey.

Implementation Plan (timeline): Surveyed annually. Scores evaluated every two-years in a six year cycle.

Key/Responsible Personnel: ME Department head and appointed ME faculty to help with the evaluation.

Supporting Attachments:

Findings for 2nd priority - EBI Senior Survey Summary of Findings: The UTC ME seniors expressed opinions concerning the questions mapped to this outcome that met and exceeded the responses from ME students from all universities participating in the EBI survey nationally.

Target Achievement: Met

Recommendations : To continue the evaluation of seniors on a continuing two-year cycle using the EBI survey results for UTC ME students and comparing them to the national ME results of those schools using the EBI survey for the same two-year cycle.

Notes :

Substantiating Evidence:

11. Engineering Professionalism

Mapped to: No Mapping

Measures & Findings

1st priority - Course Assessment Course level; Direct - Portfolio

Details/Description: ENME assessment of ENME 442, 443, 447 & 450 course files to find evidence of Outcome 11.

Target: On the assessment form for the course being evaluated for Outcome 11, if the assessed student work scores greater than 2.0 on a scale of 1.0 to 3.0, then that work meets the outcome.

Implementation Plan (timeline): Process performed every 2 years in a six year cycle

Key/Responsible Personnel: Engineering faculty assessment committee assigned

Supporting Attachments:

Findings for 1st priority - Course Assessment Summary of Findings: The students assessed in the courses met or exceeded the assessment mean of 2.0 on a 1.0 to 3.0 scale.

Target Achievement: Met

Recommendations : To continue the use of the assessment tools to monitor student success (> 2.0 mean) on this outcome.

Notes :

Substantiating Evidence:

2nd priority - EBI Senior Survey Program level; Indirect - Survey

Details/Description: Every year, each ME senior takes the EBI Senior survey. Specific survey questions are identified for Outcome 11 and are used to assess the outcome

Target: The target is for the mean scores for the UTC ME senior responses for the target questions supporting Outcome 11 to meet or exceed national mean scores for the total participating schools that are using the EBI senior survey.

Implementation Plan (timeline): Surveyed annually. Scores evaluated every two-years in a six year cycle.

Key/Responsible Personnel: ME Department head and appointed ME faculty to help with the evaluation.

Supporting Attachments:

Findings for 2nd priority - EBI Senior Survey Summary of Findings: The UTC ME seniors expressed opinions concerning the questions mapped to this outcome that met and exceeded the responses from ME students from all universities participating in the EBI survey nationally.

Target Achievement: Met

Recommendations : To continue the evaluation of seniors on a continuing two-year cycle using the EBI survey results for UTC ME students and comparing them to the national ME results of those schools using the EBI survey for the same two-year cycle.

Notes :

Substantiating Evidence:


Report: Assessment Plan Details for: Civil Engineering: BS

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Civil Engineering: BS Outcome Set

Outcomes

Fundamental Knowledge

Mapped to: No Mapping

Measures & Findings

Fundamental Knowledge Other level; Direct - Exam

Details/Description: The Fundamentals of Engineering (FE) examination is a nationally administered test developed by the National Council of Examiners for Engineering and Surveying (NCEES). The FE exam covers subject matter taught in a typical baccalaureate engineering program. All CE students at UTC are strongly encouraged to take the FE exam toward the end of their senior year.

Target: Our students will score at least 85% of the national average.

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: ABET Coordinator within the CE department

Supporting Attachments:

Findings for Fundamental Knowledge Summary of Findings: Our students scored 97% of the national average, well above the target of 85%.

Target Achievement: Exceeded

Recommendations : No curricular improvements are recommended at this time.

Notes :

Substantiating Evidence:

Fundamental Knowledge Course level; Direct - Portfolio

Details/Description: A key tool in the assessment of program outcomes is the evaluation of courses within the curriculum. Learning objectives have been established for each course, and these objectives have been mapped to corresponding program outcomes. Upon completion of a course, the instructor assembles a course folder. The course folder is then evaluated by at least three faculty members.

Target: Portfolios assessed must score greater than 2 on a scale of 1 to 3, with 3 being the highest.

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: ABET coordinator within the CE department

Supporting Attachments:

Findings for Fundamental Knowledge Summary of Findings: Average score on the portfolios was 2.1, which meets the set target of 2.0.

Target Achievement: Exceeded

Recommendations : No curricular improvements are required at the present time.

Notes :

Substantiating Evidence:
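
A minimal sketch of the course-folder scoring arithmetic, assuming hypothetical course names and ratings: each folder is rated by at least three faculty members on the 1-to-3 scale, and the average is compared with the target of greater than 2.

    # Hypothetical course-folder ratings on the 1-3 scale (course names and
    # scores are illustrative only).
    folder_ratings = {
        "Course A": [2, 3, 2],
        "Course B": [3, 2, 2],
    }
    target = 2.0

    all_scores = [score for ratings in folder_ratings.values() for score in ratings]
    average = sum(all_scores) / len(all_scores)
    print(f"Average portfolio score: {average:.1f} (target: greater than {target})")
    print("Target met:", average > target)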


Contemporary, Societal and Global Issues

Mapped to: No Mapping

Measures & Findings

Contemporary, Societal and Global Issues Other level; Indirect - Survey

Details/Description: Educational Benchmark, Inc. (EBI) produces a standardized, national survey that is administered to students at engineering schools throughout the United States. Graduating seniors complete the EBI survey at the end of the fall and spring semester every year.

Target: Our students will score 90% of participating schools' average score

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: ABET coordinator within the CE department

Supporting Attachments:

Findings for Contemporary, Societal and Global Issues Summary of Findings: Our students scored 106% of the national average. Our students scored well above the target of 90%.

Target Achievement: Exceeded

Recommendations : No curricular improvements are recommended at this time.

Notes :

Substantiating Evidence:

Contemporary, Societal and Global Issues Course level; Direct - Portfolio

Details/Description: A key tool in the assessment of program outcomes is the evaluation of courses within the curriculum. Learning objectives have been established for each course, and these objectives have been mapped to corresponding program outcomes. Upon completion of a course, the instructor assembles a course folder. The course folder is then evaluated by at least three faculty members.

Target: Portfolios assessed must score greater than 2 on a scale of 1 to 3, with 3 being the highest.

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: ABET coordinator within the CE department

Supporting Attachments:

Findings for Contemporary, Societal and Global Issues Summary of Findings: Average score on the portfolios was 2.0, which meets the set target of 2.0.

Target Achievement: Met

Recommendations : No curricular improvements are required at the present time.

Notes :

Substantiating Evidence:

Modern Engineering Tools

Mapped to: No Mapping

Measures & Findings

Modern Engineering Tools Program level; Direct - Exam

Details/Description: The Fundamentals of Engineering (FE) examination is a nationally administered test developed by the National Council of Examiners for Engineering and Surveying (NCEES). The FE exam covers subject matter taught in a typical baccalaureate engineering program. All CE students at UTC are strongly encouraged to take the FE exam toward the end of their senior year.

Target: All students will score 85% of national average score

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: ABET coordinator within the CE department

Supporting Attachments:

Findings for Modern Engineering Tools Summary of Findings: Our students scored 100% of the national average. Our students scored well above the target of 85%.

Target Achievement: Exceeded

Recommendations : No curricular improvements are required at the present time.

Notes :

Substantiating Evidence:

Modern Engineering Tools Course level; Direct - Portfolio

Details/Description: A key tool in the assessment of program outcomes is the evaluation of courses within the curriculum. Learning objectives have been established for each course, and these objectives have been mapped to corresponding program outcomes. Upon completion of a course, the instructor assembles a course folder. The course folder is then evaluated by at least three faculty members.

Target: Portfolios assessed must score greater than 2 on a scale of 1 to 3, with 3 being the highest.

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: ABET coordinator within the CE department

Supporting Attachments:

Findings for Modern Engineering Tools Summary of Findings: Average score on the portfolios was 2.2, which exceeds the set target of 2.0.

Target Achievement: Exceeded

Recommendations : No curricular improvements are required at the present time.

Notes :

Substantiating Evidence:

Effective Communication and Team Work

Mapped to: No Mapping

Measures & Findings

Effective Communication and Team Work Other level; Indirect - Survey

Details/Description: Educational Benchmark, Inc. (EBI) produces a standardized, national survey that is administered to students at engineering schools throughout the United States. Graduating seniors complete the EBI survey at the end of the fall and spring semester every year.

Target: Our students will score 90% of participating schools' average score

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: ABET coordinator within the CE department

Supporting Attachments:

Findings for Effective Communication and Team Work Summary of Findings: Our students scored 108% of the national average. Our students scored well above the target of 90%.

Target Achievement: Exceeded

Recommendations : No curricular improvements are recommended at this time.

Notes :

Substantiating Evidence:

Effective Communication and Team Work Course level; Direct - Portfolio


Details/Description: A key tool in the assessment of program outcomes is the evaluation of courses within the curriculum. Learning objectives have been established for each course, and these objectives have been mapped to corresponding program outcomes. Upon completion of a course, the instructor assembles a course folder. The course folder is then evaluated by at least three faculty members.

Target: Portfolios assessed must score greater than 2 on a scale of 1 to 3, with 3 being the highest.

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: ABET coordinator within the CE department

Supporting Attachments:

Findings for Effective Communication and Team Work Summary of Findings: Average score on the portfolios was 2.2, which exceeds the set target of 2.0.

Target Achievement: Exceeded

Recommendations : No curricular improvements are required at the present time.

Notes :

Substantiating Evidence:

Ethical Responsibility and Professional Societies

Mapped to: No Mapping

Measures & Findings

Ethical Responsibility and Professional Societies Other level; Direct - Exam

Details/Description: The Fundamentals of Engineering (FE) examination is a nationally administered test developed by the National Council of Examiners for Engineering and Surveying (NCEES). The FE exam covers subject matter taught in a typical baccalaureate engineering program. All CE students at UTC are strongly encouraged to take the FE exam toward the end of their senior year.

Target: All students will score 85% of national average score

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: ABET coordinator within the CE department

Supporting Attachments:

Findings for Ethical Responsibility and Professional Societies Summary of Findings: Our students scored 94% of the national average. Our students scored well above the target of 85%.

Target Achievement: Exceeded

Recommendations : No curricular improvements at this time.

Notes :

Substantiating Evidence:

Ethical Responsibility and Professional Societies Other level; Indirect - Survey

Details/Description: Educational Benchmark, Inc. (EBI) produces a standardized, national survey that is administered to students at engineering schools throughout the United States. Graduating seniors complete the EBI survey at the end of the fall and spring semester every year.

Target: Our students will score 90% of participating schools' average score

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: ABET coordinator within the CE department

Supporting Attachments:

Findings for Ethical Responsibility and Professional Societies Summary of Findings: Our students scored 94% of the national average. Our students scored well above the target of 90%.

Target Achievement: Exceeded

Recommendations : No curricular improvements are recommended at this time.

Notes :

Substantiating Evidence:

Plan and Conduct Experiments

Mapped to: No Mapping

Measures & Findings

Plan and Conduct Experiments Other level; Indirect - Survey

Details/Description: Educational Benchmark, Inc. (EBI) produces a standardized, national survey that is administered to students at engineering schools throughout the United States. Graduating seniors complete the EBI survey at the end of the fall and spring semester every year.

Target: Our students will score 90% of participating schools' average score

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: ABET coordinator within the CE department

Supporting Attachments:

Findings for Plan and Conduct Experiments Summary of Findings: Our students scored 110% of the national average. Our students scored well above the target of 90%.

Target Achievement: Exceeded

Recommendations : No curricular improvements are recommended at this time.

Notes :

Substantiating Evidence:

Plan and Conduct Experiments Course level; Direct - Portfolio

Details/Description: A key tool in the assessment of program outcomes is the evaluation of courses within the curriculum. Learning objectives have been established for each course, and these objectives have been mapped to corresponding program outcomes. Upon completion of a course, the instructor assembles a course folder. The course folder is then evaluated by at least three faculty members.

Target: Portfolios assessed must score greater than 2 on a scale of 1 to 3, with 3 being the highest.

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: ABET coordinator within the CE department

Supporting Attachments:

Findings for Plan and Conduct Experiments Summary of Findings: Average score on the portfolios was 2.2, which exceeds the set target of 2.0.

Target Achievement: Exceeded

Recommendations : No curricular improvements are required at the present time.

Notes :

Substantiating Evidence:

Major Design in Civil Engineering

Mapped to: No Mapping

Measures & Findings

Major Design in Civil Engineering Course level; Direct - Portfolio

Details/Description: A key tool in the assessment of program outcomes is the evaluation of courses within the curriculum. Learning objectives have been established for each course, and these objectives have been mapped to corresponding program outcomes. Upon completion of a course, the instructor assembles a course folder. The course folder is then evaluated by at least three faculty members.

Target: Portfolios assessed must score greater than 2 on a scale of 1 to 3, with 3 being the highest.

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: ABET coordinator within the CE department

Supporting Attachments:

Findings for Major Design in Civil Engineering Summary of Findings: Average score on the portfolios was 1.9, which falls just short of the set target of 2.0.

Target Achievement: Not Met

Recommendations : No curricular improvements at this time. The faculty adopted a wait-and-see approach. The department will continue to monitor this outcome, and it will be assessed in the next assessment cycle.

Notes :

Substantiating Evidence:

Major Design in Civil Engineering Other level; Indirect - Survey

Details/Description: Educational Benchmark, Inc. (EBI) produces a standardized, national survey that is administered to students at engineering schools throughout the United States. Graduating seniors complete the EBI survey at the end of the fall and spring semester every year.

Target: Our students will score 90% of participating schools' average score

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: ABET coordinator within the CE department

Supporting Attachments:

Findings for Major Design in Civil Engineering Summary of Findings: Our students scored 110% of the national average. Our students scored well above the target of 90%.

Target Achievement: Exceeded

Recommendations : No curricular improvements are recommended at this time.

Notes :

Substantiating Evidence:


Report: Assessment Plan Details for: Computational Engineering: PhD

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

Computational Engineering: PhD Outcome Set

Outcomes

Academic Preparation

Mapped to: No Mapping

Measures & Findings

Competency in the Three Computational Engineering Core Areas Program level; Direct - Exam

Details/Description: At or near the completion of all coursework, students are given a comprehensive examination on material from the three core areas of computational engineering. This review and exam provides a global perspective of coursework, especially in relation to the dissertation research project. These exams cover four of the following five major topics: (1) computational fluid dynamics, (2) physical fluid dynamics, (3) mathematics of computation, (4) parallel scientific computing, and (5) grid generation. All students’ exams include material from topics (1) and (3); the other two areas are determined by the student and their major advisor, based on the dissertation research area.

Target: All students score very good or excellent (4+) in 3 exams and satisfactory (3+) in the 4th (1-5 scale).

Implementation Plan (timeline): Assessment is made at the time of the preliminary exam.

Key/Responsible Personnel: Program coordinator and faculty examiners.

Supporting Attachments:

Findings for Competency in the Three Computational Engineering Core Areas Summary of Findings: 1) Students with non-engineering backgrounds need exposure to engineering problem solving as well as some important physics-related topics related to computational field simulation. 2) Computational engineering coursework is heavily project-oriented and requires at least an intermediate level of scientific programming skills. Many new students struggle with completing their course-related projects because of a lack of programming skills. 3) Most new students needed an introduction to computational engineering as soon as possible. 4) Given that this is an innovative interdisciplinary program in which no undergraduate major provides complete preparation for the program, it was found that a systematic and thorough orientation and advisement process was needed for incoming students. 5) Faculty have noticed that a small number of students have had difficulty with various mathematical manipulations and derivations requiring skills that should have been mastered in undergraduate mathematics courses.

Target Achievement: Met

Recommendations : 1) Based on the advisement process, students needing more background are required to take 2-3 undergraduate engineering courses in vector statics, fluid mechanics, and thermodynamics (these courses are not counted as part of students' graduate degree). 2) A new 1-hour course was introduced to provide basic orientation and instruction in computer skills needed early in the program. 3) All new students are given introductory projects in simulation-related programming designed to provide practical experience in formulating a numerical algorithm, implementing the algorithm in software, compiling and debugging the software, and interpreting computed results. Students are to complete these projects during the summer that follows their first two full-time semesters. 4) An orientation process for each individual student was developed and implemented by the Program Coordinator. The orientation includes a formal advisement interview and checklist, a SimCenter tour, assignment of cubicle space, authorization for SimCenter computer access, and a booklet “Process Information for New Students” explaining administrative processes such as registration and payroll encountered by new SimCenter students. Early in the first semester, the Department Head also gives a PowerPoint presentation “New Student Orientation” to all new students covering 1) program objectives, 2) building, 3) computer resources, 4) expectations of faculty and students, 5) academic requirements, 6) research requirements, 7) faculty and staff, and 8) the summer project. To reinforce all three outcome objectives, the program’s working environment, which includes students in the team research conducted in the SimCenter, is explained. There is a mutual expectation among SimCenter faculty and students that students will have meaningful technical interactions with multiple faculty and with other students. 5) Beginning in the Fall 2010 semester, a mathematics review test will be given during the new-student orientation and advisement process to provide early diagnosis of any weaknesses relevant to the curriculum.

Notes :


Substantiating Evidence:
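
The exam target stated above (scores of 4 or better on three of the four core-area exams and at least 3 on the fourth, on a 1-5 scale) can be expressed as a simple check. The scores in the sketch are hypothetical.

    # Hypothetical check of one student's four core-area exam scores against the
    # stated target (1-5 scale).
    def meets_exam_target(scores):
        if len(scores) != 4:
            raise ValueError("each student is examined in four core areas")
        ranked = sorted(scores, reverse=True)
        # Top three exams must be very good/excellent (4+); the fourth at least satisfactory (3+).
        return all(s >= 4 for s in ranked[:3]) and ranked[3] >= 3

    print(meets_exam_target([5, 4, 4, 3]))  # True: three scores of 4+ and one score of 3
    print(meets_exam_target([5, 4, 3, 3]))  # False: only two scores of 4+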

Communication and Teamwork

Mapped to: No Mapping

Measures & Findings

Assessing Teamwork Program level; Indirect - Survey

Details/Description: The program monitors and assesses this outcome by obtaining individual evaluations of each student from all SimCenter academic and research faculty in the program. An anecdotal survey of faculty documents both significant/regular and occasional faculty/student interactions that relate specifically to research activities, as opposed to normal student-instructor interactions associated with coursework.

Target: All students having significant/regular research interactions with at least two faculty, and occasional interaction with at least two faculty.

Implementation Plan (timeline): Survey is conducted yearly near the end of the Spring semester.

Key/Responsible Personnel: Program coordinator.

Supporting Attachments:

Findings for Assessing Teamwork Summary of Findings: Both Ph.D. and M.S. (Thesis) students had regular research interactions with about 2 faculty and occasional interactions with an additional 4-5 faculty. Current students (Fall 2009) with 1 or more years in the program are also developing significant interaction with multiple faculty members. On average, non-thesis students had significantly less faculty interaction and less research progress than other students, which is consistent with their election of the non-thesis option.

Target Achievement: Met

Recommendations : Results obtained thus far are viewed as excellent outcomes for graduates and evidence of success in meeting this program objective.

Notes :

Substantiating Evidence:
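
The target for this measure is a per-student count of faculty research interactions. The sketch below tallies hypothetical survey records against the stated thresholds (at least two faculty for significant/regular interaction and at least two for occasional interaction).

    # Hypothetical faculty-survey records of research interactions per student.
    interactions = {
        "Student 1": {"regular": ["Faculty A", "Faculty B"],
                      "occasional": ["Faculty C", "Faculty D", "Faculty E"]},
        "Student 2": {"regular": ["Faculty A"],
                      "occasional": ["Faculty B", "Faculty C"]},
    }

    for student, record in interactions.items():
        meets = len(record["regular"]) >= 2 and len(record["occasional"]) >= 2
        print(f"{student}: target met = {meets}")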

Independent Research

Mapped to: No Mapping

Measures & Findings

Assessment of Research Progress Program level; Indirect - Survey

Details/Description: Informal survey of faculty members' perception of students' progress with research.

Target: 75% rated as "Making Excellent Progress" and 25% rated as "Expected to Make Progress."

Implementation Plan (timeline): Assessment is made annually around the end of spring semester.

Key/Responsible Personnel: Program coordinator.

Supporting Attachments:

Findings for Assessment of Research Progress Summary of Findings: 77% of the faculty who reported significant/regular or occasional interactions with the 10 Ph.D. students who have graduated assessed their research as “Making Excellent Progress,” whereas 21% of faculty reported these students as “Expected to Make Progress.”

Target Achievement: Met

Recommendations : Although this is a small sample, these results are viewed as excellent outcomes for graduates and evidence of success in meeting this program objective.

Notes :

Substantiating Evidence:

Quality of Dissertation Research Program level; Direct - Other

Details/Description: Students submit a written dissertation that is reviewed by the major advisor and committee members. Students present and defend their dissertation to a general audience, after which the faculty and committee members conduct an oral exam to complete their evaluation of the dissertation. The Program Coordinator subsequently consults with faculty and records an anecdotal assessment of the dissertation quality and student research expertise.

Target: 75% of dissertations rated as "A" and 25% rated as "B".

Implementation Plan (timeline): Assessment is made following committee approval of the dissertation.

Key/Responsible Personnel: Program coordinator and Dissertation committee.

Supporting Attachments:

Findings for Quality of Dissertation Research Summary of Findings: The results of the anecdotal assessment by faculty of dissertation quality and student research expertise for the ten Ph.D. students who have graduated are: A (70%) and B (30%).

Target Achievement: Met

Recommendations : Although the quantitative goals of 75% (A) and 25% (B) were technically not met, the number of samples observed thus far is considered small and the actual 70%-30% findings are considered excellent outcomes for students and evidence that this program objective is being achieved.

Notes :

Substantiating Evidence:
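
A small sketch of the rating arithmetic reported above: with ten graduates, seven "A" and three "B" ratings give the 70%/30% split that is compared with the 75%/25% target.

    # Ratings for the ten Ph.D. graduates as reported above (7 "A", 3 "B").
    ratings = ["A"] * 7 + ["B"] * 3
    n = len(ratings)
    share_a = ratings.count("A") / n
    share_b = ratings.count("B") / n
    print(f"A: {share_a:.0%}, B: {share_b:.0%} over {n} dissertations "
          f"(target: 75% A, 25% B)")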


Report: Assessment Plan Details for: Computer Science

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

(A - K) ABET Outcomes for Computer Science

Outcome

Outcome A

Mapped to: No Mapping

Measures & Findings

CAAP Program level; Direct - Exam

Details/Description: Portions of the CAAP test are required of all graduates of UTC. Students are randomly chosen to take different modules of the test: Reading, Writing Skills, Writing Essay, Mathematics, Science, Critical Thinking. The results of each graduating class on these test modules are used in the faculty evaluation of all Outcomes A-K/L at the end of the spring semester.

Target: Senior Students

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: Department Head

Supporting Attachments:

Findings for CAAP Summary of Findings: All UTC graduating seniors are required to take the Collegiate Assessment of Academic Proficiency (CAAP), which provides measures of student General Education achievement. The CAAP, developed by the ACT organization, is a standardized, nationally-normed test which measures proficiency in five elements of UTC’s general education program. ACT is a nonprofit corporation, founded in 1959, which offers a wide variety of assessment instruments and services, including the ACT test required by many colleges and universities for admission. More information on the CAAP test may be found at http://www.act.org/caap/, which is the source of the information in this report. As described on the web site http://www.utc.edu/Administration/PlanningEvaluationAndInstitutionalResearch/caap.php, graduating seniors are required to take the CAAP test as a condition of graduation from UTC, but each student is randomly assigned to be tested in only one of the five areas: writing skills, mathematics, reading, science, and critical thinking. Results are reported for the Computer Science program, the College of Engineering and Computer Science, and UTC as a whole, allowing comparisons among majors at UTC. National statistics also allow comparison of UTC students’ level of proficiency with that of students in a variety of universities across the United States.

The original plan was that CS majors would be compared to all college majors, then all UTC students, then to national averages. However, the small number of CS graduates makes this impracticable. It would take a minimum of five graduates to have even one student taking each section of the test, and since students are randomly assigned to sections, having five students take the test would not ensure that each section would provide even a single result. This would not allow any results obtained to be statistically significant. Even looking at graduates of the College of Engineering and Computer Science still provides a very small sample size.

However, since our students take the same general education courses as the other students at UTC, and since our students are, based on incoming qualifications, at least as well qualified as those in other majors, we feel justified in examining the performance of UTC students as compared to national norms, and extrapolating those results to our CS majors, to provide one measure of how well the outcomes covered by this test are met by our students. These results are not the only measure we apply, but since our other measures, such as surveys, are subjective, this objective measure has been retained, although it is imperfect. We will continue to use the information which CAAP provides, while continuing to search for any additional objective measures of this content. The only results currently available are those of the 2008 and 2009 exams.

Target Achievement: Met

Recommendations : More frequent exams.

Notes :

Substantiating Evidence:
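
The sampling concern described above, that five graduates randomly assigned to five modules will usually not cover every module, can be checked with a quick simulation. The sketch below estimates, for a few hypothetical class sizes, the probability that every CAAP module receives at least one test taker.

    # Monte Carlo estimate of module coverage when each graduate is randomly
    # assigned to one of the five CAAP areas (class sizes are illustrative).
    import random

    MODULES = ["writing skills", "mathematics", "reading", "science", "critical thinking"]

    def prob_all_modules_covered(n_graduates, trials=100_000):
        covered = 0
        for _ in range(trials):
            assigned = {random.choice(MODULES) for _ in range(n_graduates)}
            if len(assigned) == len(MODULES):
                covered += 1
        return covered / trials

    for n in (5, 10, 20):
        print(f"{n} graduates: P(all five modules covered) ~ {prob_all_modules_covered(n):.3f}")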

MFT Direct - Exam

Details/Description: The Major Field Test is given to seniors every two years. The results from the three subsections (Programming, Discrete Structures and Algorithms and Systems) are also used in the faculty evaluation of all Outcomes A-K/L at the end of the spring semester.

Target: Senior Students

Implementation Plan (timeline): Every 3 years

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for MFT Summary of Findings: The Computer Science Major Field Test was developed by the Educational Testing Service, the nonprofit organization behind such seminal, well-respected exams as the GRE, the SAT, and PRAXIS. The computer science test is a two-hour, multiple-choice examination designed to test content knowledge in the areas of discrete structures (at least 15 percent), programming (at least 21 percent), algorithms and complexity (at least 16 percent), systems (at least 16 percent), software engineering (at least 3 percent), information management (at least 3 percent), and other topics such as intelligent systems, professional issues, and human computer interaction. Results are provided for individual test takers and for the university computer science program as a whole. Assessment indicators, in the form of average percentage of correct answers in particular content areas, supply feedback as to whether students in a program are having difficulty in a given subject area. In addition, a Comparative Data Guide is generated each year, which allows a program to compare its performance to that of the numerous other programs, ranging from small liberal arts colleges to large state research universities, which participate in the exam nationally. Information on the MFT is taken from www.ets.org, and more information may be found at this site.

Due to cost constraints, this test was formerly given to graduating seniors only once every five years. However, due to a recognition of the need for more objective measurement of how well students in our program achieve outcomes and compare to those at other universities, it was agreed in 2008 that this test would be taken by graduating seniors every second year, beginning Spring 2008. The test covers Programming Fundamentals, Computer Organization and Architectures, Operating Systems, Algorithms, Computer Science Theory, and Computational Mathematics. Results of the test in the past have been used as an informal measure of program success, but no formal process was in place.

For evaluation of results, a process was proposed by which the Assessment Committee would compare our students’ scores, as a group, to the national averages using a comparison of the number of standard deviations away from the national averages our students were in different categories. Any areas in which our students scored more than one standard deviation below the national mean would be brought to the attention of the full CSE faculty at the annual fall retreat for discussion. As the number of students currently in the program is small, setting percentage goals for scores at this point was thought to be impractical. However, each year, the evaluation results were to be examined for trends and evidence of weaknesses in the program by the Assessment Committee, and appropriate recommendations would be made to the CSE faculty. As the number of students in the program increases, use of numerical targets would be reexamined.

Target Achievement: Met

Recommendations : Results of the test are given and compared to previous results

Notes :

Substantiating Evidence:
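
The proposed evaluation procedure, comparing the program's mean in each content area with the national mean in units of standard deviations and flagging anything more than one standard deviation low, can be sketched as follows. All numbers are hypothetical stand-ins for the ETS Comparative Data Guide figures.

    # Hypothetical MFT assessment-indicator comparison (program mean, national
    # mean, and national standard deviation per content area are made up).
    areas = {
        "Programming": {"program": 52.0, "national": 55.0, "sd": 6.0},
        "Discrete Structures": {"program": 48.0, "national": 50.0, "sd": 7.0},
        "Algorithms and Systems": {"program": 45.0, "national": 54.0, "sd": 8.0},
    }

    for area, s in areas.items():
        z = (s["program"] - s["national"]) / s["sd"]
        flag = "bring to fall retreat" if z < -1.0 else "ok"
        print(f"{area}: {z:+.2f} standard deviations from the national mean ({flag})")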

Program Rubrics Program level; Direct - Student Artifact

Details/Description: Rubrics were identified for each A-K/L Outcome by the faculty. Each rubric is designed to assess one program outcome A-K/L and the extent that each program outcome was met was measured based on the course outcome competencies listed in each program outcome.

Target: All students

Implementation Plan (timeline): Every semester

Key/Responsible Personnel: Dept. Head/Assessment Coordinator

Supporting Attachments:

Findings for Program Rubrics Summary of Findings: The course student surveys allow us to capture the students’ input into the assessment process. The course artifacts provide the instructor’s measure of student competencies. The assessment team’s evaluation of the course artifacts and evaluation of each Outcome A-K/L gives us a measure of the effectiveness of our instruction and the extent to which students meet the set objectives and program outcomes.

Target Achievement: Exceeded

Recommendations : We are fine tuning the rubrics.

Notes :

Substantiating Evidence:

Student/Course Survey Course level; Direct - Exam

Details/Description: For each core course a team composed of those faculty who have special expertise with the course (they have taught the course or subsequent courses that have prerequisite competencies from the course) prepares a list of course objectives and a student survey designed to gauge the students’ perception of their attainment of the course objectives. The team identifies the mapping of course objectives to Outcome competencies. The team reviews the course objectives and mapping each spring or more often if review is requested by faculty teaching the course.

Target:

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: Head of Department/Assessment Coordinator

Supporting Attachments:

Findings for Student/Course Survey Summary of Findings: The results of the student survey and portfolios of typical student work are included with the course report. A portfolio consisting of all of the exercises and exams for at least three students – one excellent (A) student, one average (B-C) student, and one poor (D-F) student – is included in the course report. The student survey results and portfolios in the course reports are used in the course review and evaluation of the appropriate competencies for Program Outcomes A-K/L.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Outcome B

Mapped to: No Mapping

Measures & Findings

Alumni Survey Program level; Indirect - Survey

Details/Description: The survey of alumni is the primary means of measuring the opinions of the alumni on how well the outcomes were achieved at the time of graduation; these opinions, offered from a more mature perspective, provide a valuable indicator of outcomes as well. The survey is a continuous online survey with enough security parameters to allow only legitimate alumni to respond.

Target: Alumni Students

Implementation Plan (timeline): Online

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for Alumni Survey Summary of Findings: The survey of alumni was initiated in spring 2008, and was planned to take place each spring semester. It was planned that graduates would be surveyed three years after graduation to give them a chance to become fully functioning Computer Science professionals, and each graduate would be surveyed only once. The survey conducted spring 2008, of 2005 graduates, was to serve as a baseline against which future results would be compared. Many of the same questions (e.g., regarding successful fulfillment of Outcomes) were asked on both the alumni survey and the senior survey. However, it was thought that the three years of professional practice between the two should give the graduate a more mature perspective, and a much more realistic assessment of how well this program actually prepared him or her for the profession – for this reason, the alumni survey would be useful in assessing fulfillment of both the Outcomes and the Objective of the program. The survey also included questions regarding such issues as continuing education, what the graduate now most wishes had been taught, commitment to lifelong learning (which is especially critical in the Computer Science profession) and strengths and weaknesses of the program. Some objective information, such as rates of graduate school acceptance and increasing levels of responsibility of alumni, could also be collected using this instrument. The surveys would also be used in the periodic assessment of the mission and objective, and in determining how well our outcomes imply our objective of preparing the student for successful practice of the Computer Science profession. As with the senior survey, in 2009, the
survey was altered to reflect the newly approved outcomes for the ISA programs and for the Scientific Applications and Software Systems concentrations in the department; since these were the first officially approved outcomes for the ISA program, the 2008 results for the outcomes for the two other concentrations in the department are not shown. The survey was also converted to a web format for easier administration and easier collation, and comparison of results from multiple years.

Target Achievement: Met

Recommendations : This survey will continue to be refined, and will continue to be administered each spring. The initial plan was for each student to be surveyed only once, three years after graduation, but with the web application, alumni from more than one graduation year responded. How this should be handled will be addressed by the Assessment Committee in the 2009-10 school year.

Notes :

Substantiating Evidence:

Industrial Advisory Board/Employers Program level; Indirect - Focus Group

Details/Description: Informal industry and employer interactions and inputs have always been part of the program. These encounters include informal focus groups with a few faculty members, faculty consulting, student internships, IEEE participation, and the College of Engineering and Computer Science Industrial Advisory Board (IAB). The College Industrial Advisory Board has been in place for many years, but is more focused on Engineering than Computer Science. The CSE department has established a departmental Industrial Advisory Board to provide more specific feedback on the needs of local industry related to our programs and the performance of our graduates whom they have employed

Target: Program

Implementation Plan (timeline): Twice a year

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for Industrial Advisory Board/Employers Summary of Findings: The board was established in fall 2007; the first meeting of the board took place on January 24, 2008, and at least one meeting has taken place in each semester since that time. A précis of the board meeting is included in the “Assessment Results” section of Appendix B-4-3. All minutes of the meetings with the advisory board will be available for inspection by the members of the ABET team at the time of the fall visit. The evolution of the department assessment processes has led to the establishment of Reading Day, the day between the last day of classes and the first day of exams in each semester, as the regularly scheduled meeting of the Advisory Board with the department faculty, giving two regularly scheduled meetings each school year. However, in the event of special needs, additional meetings with the board will be scheduled. The members of the advisory board have demonstrated a strong commitment to supporting our programs now and in the future.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Program Rubrics Program level; Direct - Student Artifact

Details/Description: Rubrics were identified for each A-K/L Outcome by the faculty. Each rubric is designed to assess one program outcome A-K/L and the extent that each program outcome was met was measured based on the course outcome competencies listed in each program outcome.

Target: All students

Implementation Plan (timeline): Every Semester

Key/Responsible Personnel: DH/Assessment Coordinator

Supporting Attachments:

Findings for Program Rubrics Summary of Findings: The course student surveys allow us to capture the students’ input into the assessment process. The course artifacts provide the instructor’s measure of student competencies. The assessment team’s evaluation of the course artifacts and evaluation of each Outcome A-K/L gives us a measure of the effectiveness of our instruction and the extent to which students meet the set objectives and program outcomes.

Target Achievement: Exceeded

Recommendations : Needs fine tuning.

Notes :

Substantiating Evidence:

Senior Survey Program level; Indirect - Survey

Details/Description: A survey of seniors in the CSE concentrations was developed to be given each spring in CPSC 490, a course that all senior CSE students must take as a capstone. The survey is administered through the class Blackboard site, assuring that each student takes the survey once. The survey consists of the Outcome competencies that broadly measure the effectiveness of our program. Each survey question is mapped to one or more competencies. The survey questions and the mapping of the questions to Outcome competencies are given in Appendix E. The results of the survey are used in the faculty evaluation of all Outcomes A-K/L at the end of the spring semester.

Target: Senior Students (CPSC 490)

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: Department Head/Assessment Coordinator

Supporting Attachments:

Findings for Senior Survey Summary of Findings: A survey of seniors in the CS concentrations was developed to be given each spring in CPSC 490, a course that all senior CS students must take as a capstone, ensuring that each student is surveyed once and only once. The plan was that after the instructor of CPSC 490 administered the survey, the Assessment Committee would perform an initial collation and evaluation of the results, which would then be discussed each fall in the pre-semester retreat for all CSE faculty. Changes, additions, and emphases of the program may be altered based on these results. The form was designed to make it numerically scorable, and therefore facilitate the evaluation of trends, areas needing improvement, and effects of changes from year to year. The Outcomes section of the survey has both “scored” responses and space for seniors to comment. All surveys and results will be available for inspection by future accreditation teams. A copy of the survey given to seniors in Spring 2008 is included in Appendix B-4-6-b, along with graphs of the results of the 2008 and 2009 administrations of the survey.

In 2009, the survey was altered to reflect the newly approved outcomes for the ISA programs and for the Scientific Applications and Software Systems concentrations in the department; since these were the first officially approved outcomes for the ISA program, the 2008 results for the outcomes for the two other concentrations in the department are not shown. The survey was also converted to a web format for easier administration and easier collation and comparison of results from multiple years. Results from both 2008 and 2009 surveys are discussed in the “Assessment Results” section of Appendix B-4-3. This survey will continue to be refined, and will continue to be administered in the senior capstone course each spring, to make sure that each student takes it only once, and when he or she is close to graduation.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:
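
Because the senior survey form is numerically scorable, year-to-year trends can be summarized by averaging the responses mapped to each outcome. The sketch below uses a hypothetical question-to-outcome mapping and hypothetical 1-5 responses.

    # Hypothetical scoring of a numerically scorable senior survey: average the
    # 1-5 responses for the questions mapped to each outcome.
    from collections import defaultdict

    question_to_outcome = {"Q1": "A", "Q2": "A", "Q3": "B"}
    responses = [            # one dict of answers per senior
        {"Q1": 4, "Q2": 5, "Q3": 3},
        {"Q1": 3, "Q2": 4, "Q3": 4},
    ]

    totals, counts = defaultdict(int), defaultdict(int)
    for answer_sheet in responses:
        for question, score in answer_sheet.items():
            outcome = question_to_outcome[question]
            totals[outcome] += score
            counts[outcome] += 1

    for outcome in sorted(totals):
        print(f"Outcome {outcome}: mean response = {totals[outcome] / counts[outcome]:.2f}")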

Outcome C

Mapped to: No Mapping

Measures & Findings

Alumni Survey Program level; Indirect - Survey

Details/Description: The survey of alumni is the primary means of measuring the opinions of the alumni on how well the outcomes were achieved at the time of graduation; these opinions, offered from a more mature perspective, provide a valuable indicator of outcomes as well. The survey is a continuous online survey with enough security parameters to allow only legitimate alumni to respond.

Target: Program Alumni

Implementation Plan (timeline): Online (ongoing)

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for Alumni Survey Summary of Findings: The survey of alumni was initiated in spring 2008, and was planned to take place each spring semester. It was planned that graduates would be surveyed three years after graduation to give them a chance to become fully functioning Computer Science professionals, and each graduate would be surveyed only once. The survey conducted spring 2008, of 2005 graduates, was to serve as a baseline against which future results would be compared. Many of the same questions (e.g., regarding successful fulfillment of Outcomes) were asked on both the alumni survey and the senior survey. However, it was thought that the three years of professional practice between the two should give the graduate a more mature perspective, and a much more realistic assessment of how well this program actually prepared him or her for the profession – for this reason, the alumni survey would be useful in assessing fulfillment of both the Outcomes and the Objective of the program. The survey also included questions regarding such issues as continuing education, what the graduate now most wishes had been taught, commitment to lifelong learning (which is especially critical in the Computer Science profession) and strengths and weaknesses of the program. Some objective information, such as rates of graduate school acceptance and increasing levels of responsibility of alumni, could also be collected using this instrument. The surveys would also be used in the periodic assessment of the mission and objective, and in determining how well our outcomes imply our objective of preparing the
student for successful practice of the Computer Science profession. As with the senior survey, in 2009, the survey was altered to reflect the newly approved outcomes for the ISA programs and for the Scientific Applications and Software Systems concentrations in the department; since these were the first officially approved outcomes for the ISA program, the 2008 results for the outcomes for the two other concentrations in the department are not shown. The survey was also converted to a web format for easier administration and easier collation, and comparison of results from multiple years.

Target Achievement: Met

Recommendations : This survey will continue to be refined, and will continue to be administered each spring. The initial plan was for each student to be surveyed only once, three years after graduation, but with the web application, alumni from more than one graduation year responded. How this should be handled will be addressed by the Assessment Committee in the 2009-10 school year.

Notes :

Substantiating Evidence:

Course Objects/Student Survey Course level; Indirect - Survey

Details/Description: For each core course a team composed of those faculty who have special expertise with the course (they have taught the course or subsequent courses that have prerequisite competencies from the course) prepares a list of course objectives and a student survey designed to gauge the students’ perception of their attainment of the course objectives. The team identifies the mapping of course objectives to Outcome competencies. The team reviews the course objectives and mapping each spring or more often if review is requested by faculty teaching the course. Target: All Students

Implementation Plan (timeline): Every Semester

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for Course Objects/Student Survey Summary of Findings: The results of the student survey and portfolios of typical student work are included with the course report. A portfolio consisting of all of the exercises and exams for at least three students – one excellent (A) student, one average (B-C) student, and one poor (D-F) student – is included in the course report. The student survey results and portfolios in the course reports are used in the course review and evaluation of the appropriate competencies for Program Outcomes A-K/L.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

MFT Program level; Direct - Exam

Details/Description: The Major Field Test is given to seniors every two years. The results from the three subsections (Programming, Discrete Structures and Algorithms and Systems) are also used in the faculty evaluation of all Outcomes A-K/L at the end of the spring semester.

Target: Program

Implementation Plan (timeline): Every 3 years

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for MFT Summary of Findings: The Computer Science Major Field Test was developed by the Educational Testing Service, the nonprofit organization behind such seminal, well-respected exams as the GRE, the SAT, and PRAXIS. The computer science test is a two-hour, multiple choice examination designed to test content knowledge in the areas of discrete structures (at least 15 percent), programming (at least 21 percent), algorithms and complexity (at least 16 percent), systems (at least 16 percent), software engineering (at least 3 percent), information management (at least 3 percent), and other topics such as intelligent systems, professional issues, and human computer interaction. Results are provided for individual test takers and for the university computer science program as a whole. Assessment indicators, in the form of average percentage of correct answers in particular content areas, supply feedback as to whether students in a program are having difficulty in a given subject area. In addition, a Comparative Data Guide is generated each year, which allows a program to compare its performance to that of the numerous other programs, ranging from small liberal arts college to large state research universities, which participate in the exam nationally. Information on the MFT is taken from www.ets.org, and
more information may be found at this site. Due to cost constraints, this test was formerly given to graduating seniors only once every five years. However, due to a recognition of the need for more objective measurement of how well students in our program achieve outcomes and compare to those at other universities, it was agreed in 2008 that this test would be taken by graduating seniors every second year, beginning Spring 2008. The test covers Programming Fundamentals, Computer Organization and Architectures, Operating Systems, Algorithms, Computer Science Theory, and Computational Mathematics. Results of the test in the past have been used as an informal measure of program success, but no formal process was in place. For evaluation of results, a process was proposed by which the Assessment Committee would compare our students’ scores, as a group, to the national averages using a comparison of the number of standard deviations away from the national averages our students were in different categories. Any areas in which our students scored more than one standard deviation below the national mean would be brought to the attention of the full CSE faculty at the annual fall retreat for discussion. As the number of students currently in the program is small, setting percentage goals for scores at this point was thought to be impractical. However, each year, the evaluation results were to be examined for trends and evidence of weaknesses in the program by the Assessment Committee, and appropriate recommendations will be made to the CSE faculty. As the number of students in the program increases, use of numerical targets would be reexamined.

Target Achievement: Met

Recommendations : Results of the test are given and compared to previous results

Notes :

Substantiating Evidence:

Program Rubrics Program level; Direct - Student Artifact

Details/Description: Rubrics were identified for each A-K/L Outcome by the faculty. Each rubric is designed to assess one program outcome (A-K/L), and the extent to which each program outcome was met was measured based on the course outcome competencies listed in each program outcome.

Target: Student work

Implementation Plan (timeline): Every semester

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for Program Rubrics Summary of Findings: The course student surveys allow us to capture the students’ input into the assessment process. The course artifacts provide the instructor’s measure of student competencies. The assessment team’s evaluation of the course artifacts and evaluation of each Outcome A-K/L gives us a measure of the effectiveness of our instruction and the extent to which students meet the set objectives and program outcomes.

Target Achievement: Met

Recommendations : Needs fine tuning.

Notes :

Substantiating Evidence:

Senior Survey Program level; Indirect - Survey

Details/Description: A survey of seniors in the CSE concentrations was developed to be given each spring in CPSC 490, a course that all senior CSE students must take as a capstone. The survey is administered through the class Blackboard site, assuring that each student takes the survey once. The survey consists of the Outcome competencies that broadly measure the effectiveness of our program. Each survey question is mapped to one or more competencies. The survey questions and the mapping of the questions to Outcome competencies are given in Appendix E. The results of the survey are used in the faculty evaluation of all Outcomes A-K/L at the end of the spring semester.

Target: Students

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for Senior Survey Summary of Findings: A survey of seniors in the CS concentrations was developed to be given each spring in CPSC 490, a course that all senior CS students must take as a capstone, ensuring that each student is surveyed once and only once. The plan was that after the instructor of CPSC 490 administered the survey, the Assessment Committee would perform an initial collation and evaluation of the results, which would then be discussed each fall in the pre-semester retreat for all CSE faculty. Changes, additions, and emphases of the program may be altered based on these results. The form was designed to make it numerically scorable, and therefore facilitate the evaluation of trends, areas needing improvement, and effects of changes from year to year. The Outcomes section of the survey has both “scored” responses and space for seniors to comment. In 2009, the survey was altered to reflect the newly approved outcomes for the ISA programs and for the Scientific Applications and Software Systems concentrations in the department; since these were the first officially approved outcomes for the ISA program, the 2008 results for the outcomes for the two other concentrations in the department are not shown. The survey was also converted to a web format for easier administration and easier collation and comparison of results from multiple years.

Target Achievement:

Recommendations : This survey will continue to be refined, and will continue to be administered in the senior capstone course each spring, to make sure that each student takes it only once, and when he or she is close to graduation.

Notes :

Substantiating Evidence:

Outcome D

Mapped to: No Mapping

Measures & Findings

Course Objects/Student Survey Program level; Indirect - Survey

Details/Description: For each core course a team composed of those faculty who have special expertise with the course (they have taught the course or subsequent courses that have prerequisite competencies from the course) prepares a list of course objectives and a student survey designed to gauge the students’ perception of their attainment of the course objectives. The team identifies the mapping of course objectives to Outcome competencies. The team reviews the course objectives and mapping each spring or more often if review is requested by faculty teaching the course. Target: Students

Implementation Plan (timeline): Every semester

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for Course Objects/Student Survey Summary of Findings: The results of the student survey and portfolios of typical student work are included with the course report. A portfolio consisting of all of the exercises and exams for at least three students – one excellent (A) student, one average (B-C) student, and one poor (D-F) student – is included in the course report. The student survey results and portfolios in the course reports are used in the course review and evaluation of the appropriate competencies for Program Outcomes A-K/L.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:
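
A minimal sketch of how the course-objective-to-Outcome mapping described above could be recorded and checked for coverage during the spring review. The course numbers and objective wording below are invented for illustration and are not taken from the department's course reports.

# Assumed structure: course -> {course objective: [program outcomes it maps to]}.
OBJECTIVE_MAP = {
    "CPSC 1xx (hypothetical)": {
        "Write small programs from a written specification": ["A", "C"],
        "Trace and debug simple algorithms by hand": ["B"],
    },
    "CPSC 3xx (hypothetical)": {
        "Design a relational schema for a given problem domain": ["C", "J"],
    },
}

PROGRAM_OUTCOMES = list("ABCDEFGHIJKL")  # Outcomes A-K/L

# Collect every outcome that at least one course objective maps to.
covered = {outcome
           for objectives in OBJECTIVE_MAP.values()
           for mapped in objectives.values()
           for outcome in mapped}
missing = [o for o in PROGRAM_OUTCOMES if o not in covered]
print("Outcomes with no mapped course objective:", ", ".join(missing) or "none")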

Industrial Advisory Board/Employers Contacts Program level; Indirect - Focus Group

Details/Description: Informal industry and employer interactions and inputs have always been part of the program. These encounters include informal focus groups with a few faculty members, faculty consulting, student internships, IEEE participation, and the College of Engineering and Computer Science Industrial Advisory Board (IAB). The College Industrial Advisory Board has been in place for many years, but is more focused on Engineering than Computer Science. The CSE department has established a departmental Industrial Advisory Board to provide more specific feedback on the needs of local industry related to our programs and the performance of our graduates whom they have employed. Target: Program

Implementation Plan (timeline): Twice a year

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for Industrial Advisory Board/Employers Contacts Summary of Findings: The board was established in fall 2007, the first meeting of the board took place on January 24, 2008, and at least one meeting has taken place in each semester since that time. A précis of the board meetings is included in the “Assessment Results” section of Appendix B-4-3. All minutes of the meetings with the advisory board will be available for inspection by the members of the ABET team at the time of the fall visit. The evolution of the department assessment processes has led to the establishment of Reading Day, the day between the last day of classes and the first day of exams in each semester, as the regularly scheduled meeting of the Advisory Board with the department faculty, giving two regularly scheduled meetings each school year. However, in the event of special needs, additional meetings with the board will be scheduled. The members of the advisory board have demonstrated a strong commitment to supporting our programs now and in the future.

Target Achievement: Met

Recommendations : More informal contacts

Notes :

Substantiating Evidence:

Program Rubrics Program level; Direct - Student Artifact

Details/Description: Rubrics were identified by the faculty for each A-K/L Outcome. Each rubric is designed to assess one program outcome, and the extent to which that outcome was met was measured based on the course outcome competencies listed under it. Target: Students

Implementation Plan (timeline): Once a year - Every Spring

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for Program Rubrics Summary of Findings: The course student surveys allow us to capture the students’ input into the assessment process. The course artifacts provide the instructor’s measure of student competencies. The assessment team’s evaluation of the course artifacts and evaluation of each Outcome A-K/L gives us a measure of the effectiveness of our instruction and the extent to which students meet the set objectives and program outcomes.

Target Achievement: Met

Recommendations : Needs fine tuning.

Notes :

Substantiating Evidence:
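
A minimal sketch of how the per-Outcome rubric ratings described above could be summarized for the faculty evaluation. The 1-4 rating scale, the "satisfactory" threshold of 3, and the scores themselves are assumptions for illustration, not values taken from the department's rubrics.

from statistics import mean

SATISFACTORY = 3  # assumed threshold on an assumed 1-4 rubric scale

# Assumed data: Outcome -> one rubric rating per sampled student artifact.
RUBRIC_RATINGS = {
    "D": [3, 4, 2, 3],
    "E": [4, 4, 3, 2, 4],
}

for outcome, ratings in sorted(RUBRIC_RATINGS.items()):
    satisfactory_share = sum(r >= SATISFACTORY for r in ratings) / len(ratings)
    print(f"Outcome {outcome}: mean {mean(ratings):.2f}, "
          f"{satisfactory_share:.0%} of artifacts rated satisfactory or better")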

Senior Survey Indirect - Survey

Details/Description: A survey of seniors in the CSE concentrations was developed to be given each spring in CPSC 490, a course that all senior CSE students must take as a capstone. The survey is administered through the class Blackboard site, ensuring that each student takes the survey once. The survey consists of the Outcome competencies that broadly measure the effectiveness of our program. Each survey question is mapped to one or more competencies. The survey questions and the mapping of the questions to Outcome competencies are given in Appendix E. The results of the survey are used in the faculty evaluation of all Outcomes A-K/L at the end of the spring semester. Target: Senior Students

Implementation Plan (timeline): Once a Year - CPSC 490

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for Senior Survey Summary of Findings: A survey of seniors in the CS concentrations was developed to be given each spring in CPSC 490, a course that all senior CS students must take as a capstone, ensuring that each student is surveyed once and only once. The plan was that after the instructor of CPSC 490 administered the survey, the Assessment Committee would perform an initial collation and evaluation of the results, which would then be discussed each fall in the pre-semester retreat for all CSE faculty. Changes, additions, and emphases of the program may be altered based on these results. The form was designed to make it numerically scorable, and therefore facilitate the evaluation of trends, areas needing improvement, and effects of changes from year to year. The Outcomes section of the survey has both “scored” responses and space for seniors to comment. All surveys and results will be available for inspection by future accreditation teams. In 2009, the survey was altered to reflect the newly approved outcomes for the ISA programs and for the Scientific Applications and Software Systems concentrations in the department; since these were the first officially approved outcomes for the ISA program, the 2008 results for the outcomes for the two other concentrations in the department are not shown. The survey was also converted to a web format for easier administration and easier collation and comparison of results from multiple years.

Target Achievement: Met

Recommendations : This survey will continue to be refined, and will continue to be administered in the senior capstone course each spring, to make sure that each student takes it only once, and when he or she is close to graduation.

Notes :

Substantiating Evidence:

Outcome E

Mapped to: No Mapping

Measures & Findings

Alumni Survey Program level; Indirect - Survey

Details/Description: The survey of alumni is the primary means of measuring the opinions of the alumni on how well the outcomes were achieved at the time of graduation; these opinions, offered from a more mature perspective, provide a valuable indicator of outcomes as well. The survey is a continuous online survey with enough security parameters to allow only legitimate alumni to respond. Target: Program alumni

Implementation Plan (timeline): Online - ongoing

Key/Responsible Personnel: Assessment committee

Supporting Attachments:

Findings for Alumni Survey Summary of Findings: The survey of alumni was initiated in spring 2008, and was planned to take place each spring semester. It was planned that graduates would be surveyed three years after graduation to give them a chance to become fully functioning Computer Science professionals, and each graduate would be surveyed only once. The survey conducted in spring 2008, of 2005 graduates, was to serve as a baseline against which future results would be compared. Many of the same questions (e.g., regarding successful fulfillment of Outcomes) were asked on both the alumni survey and the senior survey. However, it was thought that the three years of professional practice between the two should give the graduate a more mature perspective, and a much more realistic assessment of how well this program actually prepared him or her for the profession – for this reason, the alumni survey would be useful in assessing fulfillment of both the Outcomes and the Objective of the program. The survey also included questions regarding such issues as continuing education, what the graduate now most wishes had been taught, commitment to lifelong learning (which is especially critical in the Computer Science profession), and strengths and weaknesses of the program. Some objective information, such as rates of graduate school acceptance and increasing levels of responsibility of alumni, could also be collected using this instrument. The surveys would also be used in the periodic assessment of the mission and objective, and in determining how well our outcomes imply our objective of preparing the student for successful practice of the Computer Science profession. As with the senior survey, in 2009 the survey was altered to reflect the newly approved outcomes for the ISA programs and for the Scientific Applications and Software Systems concentrations in the department; since these were the first officially approved outcomes for the ISA program, the 2008 results for the outcomes for the two other concentrations in the department are not shown. The survey was also converted to a web format for easier administration and easier collation and comparison of results from multiple years.

Target Achievement:

Recommendations : This survey will continue to be refined, and will continue to be administered each spring. The initial plan was for each student to be surveyed only once, three years after graduation, but with the web application, alumni from more than one graduation year responded. How this should be handled will be addressed by the Assessment Committee in the 2009-10 school year.

Notes :

Substantiating Evidence:
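
A minimal sketch of the cohort selection implied above (graduates surveyed once, three years after graduation), including the de-duplication concern raised in the recommendation about alumni from more than one graduation year responding. The record layout, e-mail addresses, and years below are illustrative assumptions.

# Assumed record layout for the alumni list and for prior responses.
ALUMNI = [
    {"email": "grad1@example.edu", "grad_year": 2005},
    {"email": "grad2@example.edu", "grad_year": 2005},
    {"email": "grad3@example.edu", "grad_year": 2006},
]
ALREADY_SURVEYED = {"grad2@example.edu"}  # responded in an earlier administration

def cohort_for(survey_year):
    """Alumni to invite: graduated three years earlier and not yet surveyed."""
    target_year = survey_year - 3
    return [a["email"] for a in ALUMNI
            if a["grad_year"] == target_year and a["email"] not in ALREADY_SURVEYED]

print(cohort_for(2008))  # -> ['grad1@example.edu']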

Course Objects/Student Survey Program level; Indirect - Survey

Details/Description: For each core course a team composed of those faculty who have special expertise with the course (they have taught the course or subsequent courses that have prerequisite competencies from the course) prepares a list of course objectives and a student survey designed to gauge the students’ perception of their attainment of the course objectives. The team identifies the mapping of course objectives to Outcome competencies. The team reviews the course objectives and mapping each spring or more often if review is requested by faculty teaching the course. Target: Students

Implementation Plan (timeline): Every semester

Key/Responsible Personnel: Instructor

Supporting Attachments:

Findings for Course Objects/Student Survey Summary of Findings: The results of the student survey and portfolios of typical student work are included with the course report. The portfolios consist of all of the exercises and exams for at least three students – one excellent (A) student, one average (B-C) student, and one poor (D-F) student. The student survey results and portfolios in the course reports are used in the course review and evaluation of the appropriate competencies for Program Outcomes A-K/L.

Target Achievement: Exceeded

Recommendations :

Notes :

Substantiating Evidence:

Program Rubrics Program level; Direct - Student Artifact

Details/Description: Rubrics were identified by the faculty for each A-K/L Outcome. Each rubric is designed to assess one program outcome, and the extent to which that outcome was met was measured based on the course outcome competencies listed under it. Target: Student work

Implementation Plan (timeline): Once a year - Spring Semester

Key/Responsible Personnel: Faculty

Supporting Attachments:

Findings for Program Rubrics Summary of Findings: The course student surveys allow us to capture the students’ input into the assessment process. The course artifacts provide the instructor’s measure of student competencies. The assessment team’s evaluation of the course artifacts and evaluation of each Outcome A-K/L gives us a measure of the effectiveness of our instruction and the extent to which students meet the set objectives and program outcomes.

Target Achievement: Exceeded

Recommendations : Needs fine tuning.

Notes :

Substantiating Evidence:

Senior Survey Program level; Indirect - Survey

Details/Description: A survey of seniors in the CSE concentrations was developed to be given each spring in CPSC 490, a course that all senior CSE students must take as a capstone. The survey is administered through the class Blackboard site, ensuring that each student takes the survey once. The survey consists of the Outcome competencies that broadly measure the effectiveness of our program. Each survey question is mapped to one or more competencies. The survey questions and the mapping of the questions to Outcome competencies are given in Appendix E. The results of the survey are used in the faculty evaluation of all Outcomes A-K/L at the end of the spring semester. Target: Students

Implementation Plan (timeline): Once a year when CPSC 490 is taught

Key/Responsible Personnel: Instructor

Supporting Attachments:

Findings for Senior Survey Summary of Findings: A survey of seniors in the CS concentrations was developed to be given each spring in CPSC 490, a course that all senior CS students must take as a capstone, ensuring that each student is surveyed once and only once. The plan was that after the instructor of CPSC 490 administered the survey, the Assessment Committee would perform an initial collation and evaluation of the results, which would then be discussed each fall in the pre-semester retreat for all CSE faculty. Changes, additions, and emphases of the program may be altered based on these results. The form was designed to make it numerically scorable, and therefore facilitate the evaluation of trends, areas needing improvement, and effects of changes from year to year. The Outcomes section of the survey has both “scored” responses and space for seniors to comment. All surveys and results will be available for inspection by future accreditation teams. A copy of the survey given to seniors in Spring 2008 is included in Appendix B-4-6-b, along with graphs of the results of the 2008 and 2009 administrations of the survey. In 2009, the survey was altered to reflect the newly approved outcomes for the ISA programs and for the Scientific Applications and Software Systems concentrations in the department; since these were the first officially approved outcomes for the ISA program, the 2008 results for the outcomes for the two other concentrations in the department are not shown. The survey was also converted to a web format for easier administration and easier collation and comparison of results from multiple years. Results from both 2008 and 2009 surveys are discussed in the “Assessment Results” section of Appendix B-4-3. This survey will continue to be refined, and will continue to be administered in the senior capstone course each spring, to make sure that each student takes it only once, and when he or she is close to graduation.

Target Achievement: Exceeded

Recommendations : Needs refining.

Notes :

Substantiating Evidence:

Outcome F

Mapped to: No Mapping

Measures & Findings

Alumni Survey Program level; Indirect - Survey

Details/Description: The survey of alumni is the primary means of measuring the opinions of the alumni on how well the outcomes were achieved at the time of graduation; these opinions, offered from a more mature perspective, provide a valuable indicator of outcomes as well. The survey is a continuous online survey with enough security parameters to allow only legitimate alumni to respond. Target: Program alumni

Implementation Plan (timeline): Online - ongoing.

Key/Responsible Personnel: Assessment committee

Supporting Attachments:

Findings for Alumni Survey Summary of Findings: The survey of alumni was initiated in spring 2008, and was planned to take place each spring semester. It was planned that graduates would be surveyed three years after graduation to give them a chance to become fully functioning Computer Science professionals, and each graduate would be surveyed only once. The survey conducted in spring 2008, of 2005 graduates, was to serve as a baseline against which future results would be compared. Many of the same questions (e.g., regarding successful fulfillment of Outcomes) were asked on both the alumni survey and the senior survey. However, it was thought that the three years of professional practice between the two should give the graduate a more mature perspective, and a much more realistic assessment of how well this program actually prepared him or her for the profession – for this reason, the alumni survey would be useful in assessing fulfillment of both the Outcomes and the Objective of the program. The survey also included questions regarding such issues as continuing education, what the graduate now most wishes had been taught, commitment to lifelong learning (which is especially critical in the Computer Science profession), and strengths and weaknesses of the program. Some objective information, such as rates of graduate school acceptance and increasing levels of responsibility of alumni, could also be collected using this instrument. The surveys would also be used in the periodic assessment of the mission and objective, and in determining how well our outcomes imply our objective of preparing the student for successful practice of the Computer Science profession. As with the senior survey, in 2009 the survey was altered to reflect the newly approved outcomes for the ISA programs and for the Scientific Applications and Software Systems concentrations in the department; since these were the first officially approved outcomes for the ISA program, the 2008 results for the outcomes for the two other concentrations in the department are not shown. The survey was also converted to a web format for easier administration and easier collation and comparison of results from multiple years.

Target Achievement: Met

Recommendations : This survey will continue to be refined, and will continue to be administered each spring. The initial plan was for each student to be surveyed only once, three years after graduation, but with the web application, alumni from more than one graduation year responded. How this should be handled will be addressed by the Assessment Committee in the 2009-10 school year.

Notes :

Substantiating Evidence:

Course Objects/Student Survey Program level; Indirect - Survey

Details/Description: For each core course a team composed of those faculty who have special expertise with the course (they have taught the course or subsequent courses that have prerequisite competencies from the course) prepares a list of course objectives and a student survey designed to gauge the students’ perception of their attainment of the course objectives. The team identifies the mapping of course objectives to Outcome competencies. The team reviews the course objectives and mapping each spring or more often if review is requested by faculty teaching the course. Target: Students

Implementation Plan (timeline): Every semester

Key/Responsible Personnel: Instructor

Supporting Attachments:

Findings for Course Objects/Student Survey Summary of Findings: The results of the student survey and portfolios of typical student work are included with the course report. The portfolios consist of all of the exercises and exams for at least three students – one excellent (A) student, one average (B-C) student, and one poor (D-F) student. The student survey results and portfolios in the course reports are used in the course review and evaluation of the appropriate competencies for Program Outcomes A-K/L.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Industrial Advisory Board/Employers Contacts Program level; Indirect - Focus Group

Details/Description: Informal industry and employer interactions and inputs have always been part of the program. These encounters include informal focus groups with a few faculty members, faculty consulting, student internships, IEEE participation, and the College of Engineering and Computer Science Industrial Advisory Board (IAB). The College Industrial Advisory Board has been in place for many years, but is more focused on Engineering than Computer Science. The CSE department has established a departmental Industrial Advisory Board to provide more specific feedback on the needs of local industry related to our programs and the performance of our graduates whom they have employed.

Target: Program

Implementation Plan (timeline): Twice a year

Key/Responsible Personnel: Head of Department

Supporting Attachments:

Findings for Industrial Advisory Board/Employers Contacts Summary of Findings: The board was established in fall 2007, the first meeting of the board took place on January 24, 2008, and at least one meeting has taken place in each semester since that time. A précis of the board meetings is included in the “Assessment Results” section of Appendix B-4-3. All minutes of the meetings with the advisory board will be available for inspection by the members of the ABET team at the time of the fall visit. The evolution of the department assessment processes has led to the establishment of Reading Day, the day between the last day of classes and the first day of exams in each semester, as the regularly scheduled meeting of the Advisory Board with the department faculty, giving two regularly scheduled meetings each school year. However, in the event of special needs, additional meetings with the board will be scheduled. The members of the advisory board have demonstrated a strong commitment to supporting our programs now and in the future.

Target Achievement: Met

Recommendations : More informal meetings

Notes :

Substantiating Evidence:

Program Rubrics Program level; Direct - Student Artifact

Details/Description: Rubrics were identified by the faculty for each A-K/L Outcome. Each rubric is designed to assess one program outcome, and the extent to which that outcome was met was measured based on the course outcome competencies listed under it. Target: Student work

Implementation Plan (timeline): Once a year - Spring Semester

Key/Responsible Personnel: Faculty

Supporting Attachments:

Findings for Program Rubrics Summary of Findings: The course student surveys allow us to capture the students’ input into the assessment process. The course artifacts provide the instructor’s measure of student competencies. The assessment team’s evaluation of the course artifacts and evaluation of each Outcome A-K/L gives us a measure of the effectiveness of our instruction and the extent to which students meet the set objectives and program outcomes.

Target Achievement: Met

Recommendations : Needs fine tuning.

Notes :

Substantiating Evidence:

Senior Survey Program level; Indirect - Survey

Details/Description: A survey of seniors in the CSE concentrations was developed to be given each spring in CPSC 490, a course that all senior CSE students must take as a capstone. The survey is administered through the class Blackboard site, ensuring that each student takes the survey once. The survey consists of the Outcome competencies that broadly measure the effectiveness of our program. Each survey question is mapped to one or more competencies. The survey questions and the mapping of the questions to Outcome competencies are given in Appendix E. The results of the survey are used in the faculty evaluation of all Outcomes A-K/L at the end of the spring semester. Target: Students

Implementation Plan (timeline): Once a year when CPSC 490 is taught.

Key/Responsible Personnel: Instructor

Supporting Attachments:

Findings for Senior Survey Summary of Findings: A survey of seniors in the CS concentrations was developed to be given each spring in CPSC 490, a course that all senior CS students must take as a capstone, ensuring that each student is surveyed once and only once. The plan was that after the instructor of CPSC 490 administered the survey, the Assessment Committee would perform an initial collation and evaluation of the results, which would then be discussed each fall in the pre-semester retreat for all CSE faculty. Changes, additions, and emphases of the program may be altered based on these results. The form was designed to make it numerically scorable, and therefore facilitate the evaluation of trends, areas needing improvement, and effects of changes from year to year. The Outcomes section of the survey has both “scored” responses and space for seniors to comment. All surveys and results will be available for inspection by future accreditation teams. A copy of the survey given to seniors in Spring 2008 is included in Appendix B-4-6-b, along with graphs of the results of the 2008 and 2009 administrations of the survey. In 2009, the survey was altered to reflect the newly approved outcomes for the ISA programs and for the Scientific Applications and Software Systems concentrations in the department; since these were the first officially approved outcomes for the ISA program, the 2008 results for the outcomes for the two other concentrations in the department are not shown. The survey was also converted to a web format for easier administration and easier collation and comparison of results from multiple years. Results from both 2008 and 2009 surveys are discussed in the “Assessment Results” section of Appendix B-4-3. This survey will continue to be refined, and will continue to be administered in the senior capstone course each spring, to make sure that each student takes it only once, and when he or she is close to graduation.

Target Achievement: Met

Recommendations : Needs more fine tuning.

Notes :

Substantiating Evidence:

Outcome G


Mapped to: No Mapping

Measures & Findings

Course Objects/Student Survey Program level; Indirect - Survey

Details/Description: For each core course a team composed of those faculty who have special expertise with the course (they have taught the course or subsequent courses that have prerequisite competencies from the course) prepares a list of course objectives and a student survey designed to gauge the students’ perception of their attainment of the course objectives. The team identifies the mapping of course objectives to Outcome competencies. The team reviews the course objectives and mapping each spring or more often if review is requested by faculty teaching the course. Target: Students

Implementation Plan (timeline): Every semester

Key/Responsible Personnel: Instructor

Supporting Attachments:

Findings for Course Objects/Student Survey Summary of Findings: The results of the student survey and portfolios of typical student work are included with the course report. The portfolios consist of all of the exercises and exams for at least three students – one excellent (A) student, one average (B-C) student, and one poor (D-F) student. The student survey results and portfolios in the course reports are used in the course review and evaluation of the appropriate competencies for Program Outcomes A-K/L.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Program Rubrics Program level; Direct - Student Artifact

Details/Description: Rubrics were identified by the faculty for each A-K/L Outcome. Each rubric is designed to assess one program outcome, and the extent to which that outcome was met was measured based on the course outcome competencies listed under it. Target: Student work

Implementation Plan (timeline): Once a year - Spring Semester

Key/Responsible Personnel: Faculty

Supporting Attachments:

Findings for Program Rubrics Summary of Findings: The course student surveys allow us to capture the students’ input into the assessment process. The course artifacts provide the instructor’s measure of student competencies. The assessment team’s evaluation of the course artifacts and evaluation of each Outcome A-K/L gives us a measure of the effectiveness of our instruction and the extent to which students meet the set objectives and program outcomes.

Target Achievement: Met

Recommendations : Needs fine tuning.

Notes :

Substantiating Evidence:

Senior Survey Program level; Indirect - Survey

Details/Description: A survey of seniors in the CSE concentrations was developed to be given each spring in CPSC 490, a course that all senior CSE students must take as a capstone. The survey is administered through the class Blackboard site, ensuring that each student takes the survey once. The survey consists of the Outcome competencies that broadly measure the effectiveness of our program. Each survey question is mapped to one or more competencies. The survey questions and the mapping of the questions to Outcome competencies are given in Appendix E. The results of the survey are used in the faculty evaluation of all Outcomes A-K/L at the end of the spring semester.

Target: Students

Implementation Plan (timeline): Once a year when CPSC 490 is taught

Key/Responsible Personnel: Instructor

Supporting Attachments:

Findings for Senior Survey Summary of Findings: A survey of seniors in the CS concentrations was developed to be given each spring in CPSC 490, a course that all senior CS students must take as a capstone, ensuring that each student is surveyed once and only once. The plan was that after the instructor of CPSC 490 administered the survey, the Assessment Committee would perform an initial collation and evaluation of the results, which would then be discussed each fall in the pre-semester retreat for all CSE faculty. Changes, additions, and emphases of the program may be altered based on these results. The form was designed to make it numerically scorable, and therefore facilitate the evaluation of trends, areas needing improvement, and effects of changes from year to year. The Outcomes section of the survey has both “scored” responses and space for seniors to comment. All surveys and results will be available for inspection by future accreditation teams. A copy of the survey given to seniors in Spring 2008 is included in Appendix B-4-6-b, along with graphs of the results of the 2008 and 2009 administrations of the survey. In 2009, the survey was altered to reflect the newly approved outcomes for the ISA programs and for the Scientific Applications and Software Systems concentrations in the department; since these were the first officially approved outcomes for the ISA program, the 2008 results for the outcomes for the two other concentrations in the department are not shown. The survey was also converted to a web format for easier administration and easier collation and comparison of results from multiple years. Results from both 2008 and 2009 surveys are discussed in the “Assessment Results” section of Appendix B-4-3. This survey will continue to be refined, and will continue to be administered in the senior capstone course each spring, to make sure that each student takes it only once, and when he or she is close to graduation.

Target Achievement: Exceeded

Recommendations : Needs more fine tuning.

Notes :

Substantiating Evidence:

Outcome H

Mapped to: No Mapping

Measures & Findings

Alumni Survey Program level; Indirect - Survey

Details/Description: The survey of alumni is the primary means of measuring the opinions of the alumni on how well the outcomes were achieved at the time of graduation; these opinions, offered from a more mature perspective, provide a valuable indicator of outcomes as well. The survey is a continuous online survey with enough security parameters to allow only legitimate alumni to respond.

Target: Program alumni

Implementation Plan (timeline): Online - ongoing.

Key/Responsible Personnel: Assessment committee

Supporting Attachments:

Findings for Alumni Survey Summary of Findings: The survey of alumni was initiated in spring 2008, and was planned to take place each spring semester. It was planned that graduates would be surveyed three years after graduation to give them a chance to become fully functioning Computer Science professionals, and each graduate would be surveyed only once. The survey conducted in spring 2008, of 2005 graduates, was to serve as a baseline against which future results would be compared. Many of the same questions (e.g., regarding successful fulfillment of Outcomes) were asked on both the alumni survey and the senior survey. However, it was thought that the three years of professional practice between the two should give the graduate a more mature perspective, and a much more realistic assessment of how well this program actually prepared him or her for the profession – for this reason, the alumni survey would be useful in assessing fulfillment of both the Outcomes and the Objective of the program. The survey also included questions regarding such issues as continuing education, what the graduate now most wishes had been taught, commitment to lifelong learning (which is especially critical in the Computer Science profession), and strengths and weaknesses of the program. Some objective information, such as rates of graduate school acceptance and increasing levels of responsibility of alumni, could also be collected using this instrument. The surveys would also be used in the periodic assessment of the mission and objective, and in determining how well our outcomes imply our objective of preparing the student for successful practice of the Computer Science profession. As with the senior survey, in 2009 the survey was altered to reflect the newly approved outcomes for the ISA programs and for the Scientific Applications and Software Systems concentrations in the department; since these were the first officially approved outcomes for the ISA program, the 2008 results for the outcomes for the two other concentrations in the department are not shown. The survey was also converted to a web format for easier administration and easier collation and comparison of results from multiple years.

Target Achievement: Met

Recommendations : This survey will continue to be refined, and will continue to be administered each spring. The initial plan was for each student to be surveyed only once, three years after graduation, but with the web application, alumni from more than one graduation year responded. How this should be handled will be addressed by the Assessment Committee in the 2009-10 school year.

Notes :

Substantiating Evidence:

Program Rubrics Program level; Direct - Student Artifact

Details/Description: Rubrics were identified by the faculty for each A-K/L Outcome. Each rubric is designed to assess one program outcome, and the extent to which that outcome was met was measured based on the course outcome competencies listed under it.

Target: Student work

Implementation Plan (timeline): Once a year - Spring Semester

Key/Responsible Personnel: Faculty

Supporting Attachments:

Findings for Program Rubrics Summary of Findings: The course student surveys allow us to capture the students’ input into the assessment process. The course artifacts provide the instructor’s measure of student competencies. The assessment team’s evaluation of the course artifacts and evaluation of each Outcome A-K/L gives us a measure of the effectiveness of our instruction and the extent to which students meet the set objectives and program outcomes.

Target Achievement: Met

Recommendations : Needs fine tuning.

Notes :

Substantiating Evidence:

Senior Survey Program level; Indirect - Survey

Details/Description: A survey of seniors in the CSE concentrations was developed to be given each spring in CPSC 490, a course that all senior CSE students must take as a capstone. The survey is administered through the class Blackboard site, ensuring that each student takes the survey once. The survey consists of the Outcome competencies that broadly measure the effectiveness of our program. Each survey question is mapped to one or more competencies. The survey questions and the mapping of the questions to Outcome competencies are given in Appendix E. The results of the survey are used in the faculty evaluation of all Outcomes A-K/L at the end of the spring semester.

Target: Students

Implementation Plan (timeline): Once a year when CPSC 490 is taught

Key/Responsible Personnel: Instructor

Supporting Attachments:

Findings for Senior Survey Summary of Findings: A survey of seniors in the CS concentrations was developed to be given each spring in CPSC 490, a course that all senior CS students must take as a capstone, ensuring that each student is surveyed once and only once. The plan was that after the instructor of CPSC 490 administered the survey, the Assessment Committee would perform an initial collation and evaluation of the results, which would then be discussed each fall in the pre-semester retreat for all CSE faculty. Changes, additions, and emphases of the program may be altered based on these results. The form was designed to make it numerically scorable, and therefore facilitate the evaluation of trends, areas needing improvement, and effects of changes from year to year. The Outcomes section of the survey has both “scored” responses and space for seniors to comment. All surveys and results will be available for inspection by future accreditation teams. A copy of the survey given to seniors in Spring 2008 is included in Appendix B-4-6-b, along with graphs of the results of the 2008 and 2009 administrations of the survey. In 2009, the survey was altered to reflect the newly approved outcomes for the ISA programs and for the Scientific Applications and Software Systems concentrations in the department; since these were the first officially approved outcomes for the ISA program, the 2008 results for the outcomes for the two other concentrations in the department are not shown. The survey was also converted to a web format for easier administration and easier collation and comparison of results from multiple years. Results from both 2008 and 2009 surveys are discussed in the “Assessment Results” section of Appendix B-4-3. This survey will continue to be refined, and will continue to be administered in the senior capstone course each spring, to make sure that each student takes it only once, and when he or she is close to graduation.

Target Achievement: Met

Recommendations : Needs fine tuning.

Notes :

Substantiating Evidence:

Outcome I

Mapped to: No Mapping

Measures & Findings

Alumni Survey Program level; Indirect - Survey

Details/Description: The survey of alumni is the primary means of measuring the opinions of the alumni on how well the outcomes were achieved at the time of graduation; these opinions, offered from a more mature perspective, provide a valuable indicator of outcomes as well. The survey is a continuous online survey with enough security parameters to allow only legitimate alumni to respond. Target: Program alumni

Implementation Plan (timeline): Online - ongoing

Key/Responsible Personnel: Assessment committee

Supporting Attachments:

Findings for Alumni Survey Summary of Findings: The survey of alumni was initiated in spring 2008, and was planned to take place each spring semester. It was planned that graduates would be surveyed three years after graduation to give them a chance to become fully functioning Computer Science professionals, and each graduate would be surveyed only once. The survey conducted in spring 2008, of 2005 graduates, was to serve as a baseline against which future results would be compared. Many of the same questions (e.g., regarding successful fulfillment of Outcomes) were asked on both the alumni survey and the senior survey. However, it was thought that the three years of professional practice between the two should give the graduate a more mature perspective, and a much more realistic assessment of how well this program actually prepared him or her for the profession – for this reason, the alumni survey would be useful in assessing fulfillment of both the Outcomes and the Objective of the program. The survey also included questions regarding such issues as continuing education, what the graduate now most wishes had been taught, commitment to lifelong learning (which is especially critical in the Computer Science profession), and strengths and weaknesses of the program. Some objective information, such as rates of graduate school acceptance and increasing levels of responsibility of alumni, could also be collected using this instrument. The surveys would also be used in the periodic assessment of the mission and objective, and in determining how well our outcomes imply our objective of preparing the student for successful practice of the Computer Science profession. As with the senior survey, in 2009 the survey was altered to reflect the newly approved outcomes for the ISA programs and for the Scientific Applications and Software Systems concentrations in the department; since these were the first officially approved outcomes for the ISA program, the 2008 results for the outcomes for the two other concentrations in the department are not shown. The survey was also converted to a web format for easier administration and easier collation and comparison of results from multiple years.

Target Achievement: Met

Recommendations : This survey will continue to be refined, and will continue to be administered each spring. The initial plan was for each student to be surveyed only once, three years after graduation, but with the web application, alumni from more than one graduation year responded. How this should be handled will be addressed by the Assessment Committee in the 2009-10 school year.

Notes :

Substantiating Evidence:

Course Objects/Student Survey Program level; Indirect - Survey

Details/Description: For each core course a team composed of those faculty who have special expertise with the course (they have taught the course or subsequent courses that have prerequisite competencies from the course) prepares a list of course objectives and a student survey designed to gauge the students’ perception of their attainment of the course objectives. The team identifies the mapping of course objectives to Outcome competencies. The team reviews the course objectives and mapping each spring or more often if review is requested by faculty teaching the course.

Target: Students

Implementation Plan (timeline): Every semester

Key/Responsible Personnel: Instructor

Supporting Attachments:

Findings for Course Objects/Student Survey Summary of Findings: The results of the student survey and portfolios of typical student work are included with the course report. The portfolios consist of all of the exercises and exams for at least three students – one excellent (A) student, one average (B-C) student, and one poor (D-F) student. The student survey results and portfolios in the course reports are used in the course review and evaluation of the appropriate competencies for Program Outcomes A-K/L.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Industrial Advisory Board/Employers Contacts Program level; Indirect - Focus Group

Details/Description: Informal industry and employer interactions and inputs have always been part of the program. These encounters include informal focus groups with a few faculty members, faculty consulting, student internships, IEEE participation, and the College of Engineering and Computer Science Industrial Advisory Board (IAB). The College Industrial Advisory Board has been in place for many years, but is more focused on Engineering than Computer Science. The CSE department has established a departmental Industrial Advisory Board to provide more specific feedback on the needs of local industry related to our programs and the performance of our graduates whom they have employed. Target: Program

Implementation Plan (timeline): Twice a year

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for Industrial Advisory Board/Employers Contacts Summary of Findings: The board was established in fall 2007, the first meeting of the board took place on January 24, 2008, and at least one meeting has taken place in each semester since that time. A précis of the board meetings is included in the “Assessment Results” section of Appendix B-4-3. All minutes of the meetings with the advisory board will be available for inspection by the members of the ABET team at the time of the fall visit. The evolution of the department assessment processes has led to the establishment of Reading Day, the day between the last day of classes and the first day of exams in each semester, as the regularly scheduled meeting of the Advisory Board with the department faculty, giving two regularly scheduled meetings each school year. However, in the event of special needs, additional meetings with the board will be scheduled. The members of the advisory board have demonstrated a strong commitment to supporting our programs now and in the future.

Target Achievement: Met

Recommendations : More informal meetings

Notes :

Substantiating Evidence:

MFT Program level; Direct - Exam

Details/Description: The Major Field Test is given to seniors every two years. The results from the three subsections (Programming; Discrete Structures and Algorithms; and Systems) are also used in the faculty evaluation of all Outcomes A-K/L at the end of the spring semester.

Target: Program

Implementation Plan (timeline): Every 2 years

Key/Responsible Personnel: Head of Department

Supporting Attachments:

Findings for MFT Summary of Findings: The Computer Science Major Field Test was developed by the Educational Testing Service, the nonprofit organization behind such seminal, well-respected exams as the GRE, the SAT, and PRAXIS. The computer science test is a two-hour, multiple-choice examination designed to test content knowledge in the areas of discrete structures (at least 15 percent), programming (at least 21 percent), algorithms and complexity (at least 16 percent), systems (at least 16 percent), software engineering (at least 3 percent), information management (at least 3 percent), and other topics such as intelligent systems, professional issues, and human-computer interaction. Results are provided for individual test takers and for the university computer science program as a whole. Assessment indicators, in the form of average percentage of correct answers in particular content areas, supply feedback as to whether students in a program are having difficulty in a given subject area. In addition, a Comparative Data Guide is generated each year, which allows a program to compare its performance to that of the numerous other programs, ranging from small liberal arts colleges to large state research universities, which participate in the exam nationally. Information on the MFT is taken from www.ets.org, and more information may be found at this site. Due to cost constraints, this test was formerly given to graduating seniors only once every five years. However, in recognition of the need for more objective measurement of how well students in our program achieve outcomes and compare to those at other universities, it was agreed in 2008 that this test would be taken by graduating seniors every second year, beginning Spring 2008. The test covers Programming Fundamentals, Computer Organization and Architectures, Operating Systems, Algorithms, Computer Science Theory, and Computational Mathematics. Results of the test in the past have been used as an informal measure of program success, but no formal process was in place. For evaluation of results, a process was proposed by which the Assessment Committee would compare our students’ scores, as a group, to the national averages, expressed as the number of standard deviations away from the national average in each category. Any areas in which our students scored more than one standard deviation below the national mean would be brought to the attention of the full CSE faculty at the annual fall retreat for discussion. As the number of students currently in the program is small, setting percentage goals for scores at this point was thought to be impractical. However, each year, the evaluation results were to be examined for trends and evidence of weaknesses in the program by the Assessment Committee, and appropriate recommendations would be made to the CSE faculty. As the number of students in the program increases, use of numerical targets would be reexamined.

Target Achievement: Met

Recommendations : Results of the test are given and compared to previous results

Notes :

Substantiating Evidence:
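
The proposed comparison against national averages is essentially a standard-deviation (z-score) check. The following is a minimal sketch; the subscore areas mirror those named above, but all numbers are made up, and in practice the national means and standard deviations would come from the ETS Comparative Data Guide.

from statistics import mean

# Assumed national statistics per assessment-indicator area: (mean percent correct, standard deviation).
NATIONAL = {
    "Programming": (52.0, 8.0),
    "Discrete Structures and Algorithms": (47.0, 9.0),
    "Systems": (45.0, 8.5),
}

# Assumed percent-correct scores for our graduating seniors in each area.
PROGRAM_SCORES = {
    "Programming": [55, 48, 60],
    "Discrete Structures and Algorithms": [35, 30, 38],
    "Systems": [50, 47, 41],
}

for area, (national_mean, national_sd) in NATIONAL.items():
    z = (mean(PROGRAM_SCORES[area]) - national_mean) / national_sd
    flag = "  <-- more than 1 SD below national mean; raise at fall retreat" if z < -1.0 else ""
    print(f"{area}: {z:+.2f} SD relative to national mean{flag}")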

Program Rubrics Program level; Direct - Other

Details/Description: Rubrics were identified by the faculty for each A-K/L Outcome. Each rubric is designed to assess one program outcome, and the extent to which that outcome was met was measured based on the course outcome competencies listed under it.

Target: Student work

Implementation Plan (timeline): Once a year - Spring Semester

Key/Responsible Personnel: Faculty

Supporting Attachments:

Findings for Program Rubrics Summary of Findings: The course student surveys allow us to capture the students’ input into the assessment process. The course artifacts provide the instructor’s measure of student competencies. The assessment team’s evaluation of the course artifacts and evaluation of each Outcome A-K/L gives us a measure of the effectiveness of our instruction and the extent to which students meet the set objectives and program outcomes.

Target Achievement: Met

Recommendations : Needs fine tuning.

Notes :

Substantiating Evidence:

Senior Survey Indirect - Survey

Details/Description: A survey of seniors in the CSE concentrations was developed to be given each spring in CPSC 490, a course that all senior CSE students must take as a capstone. The survey is administered through the class Blackboard site, ensuring that each student takes the survey once. The survey consists of the Outcome competencies that broadly measure the effectiveness of our program. Each survey question is mapped to one or more competencies. The survey questions and the mapping of the questions to Outcome competencies are given in Appendix E. The results of the survey are used in the faculty evaluation of all Outcomes A-K/L at the end of the spring semester.

Target: Students

Implementation Plan (timeline): Once a year when CPSC 490 is taught

Key/Responsible Personnel: Instructor

Supporting Attachments:

Findings for Senior Survey Summary of Findings: A survey of seniors in the CS concentrations was developed to be given each spring in CPSC 490, a course that all senior CS students must take as a capstone, ensuring that each student is surveyed once and only once. The plan was that after the instructor of CPSC 490 administered the survey, the Assessment Committee would perform an initial collation and evaluation of the results, which would then be discussed each fall in the pre-semester retreat for all CSE faculty. Changes, additions, and emphases of the program may be altered based on these results. The form was designed to make it numerically scorable, and therefore facilitate the evaluation of trends, areas needing improvement, and effects of changes from year to year. The Outcomes section of the survey has both “scored” responses and space for seniors to comment. All surveys and results will be available for inspection by future accreditation teams. A copy of the survey given to seniors in Spring 2008 is included in Appendix B-4-6-b, along with graphs of the results of the 2008 and 2009 administrations of the survey. In 2009, the survey was altered to reflect the newly approved outcomes for the ISA programs and for the Scientific Applications and Software Systems concentrations in the department; since these were the first officially approved outcomes for the ISA program, the 2008 results for the outcomes for the two other concentrations in the department are not shown. The survey was also converted to a web format for easier administration and easier collation and comparison of results from multiple years. Results from both 2008 and 2009 surveys are discussed in the “Assessment Results” section of Appendix B-4-3. This survey will continue to be refined, and will continue to be administered in the senior capstone course each spring, to make sure that each student takes it only once, and when he or she is close to graduation.

Target Achievement: Exceeded

Recommendations : Needs fine tuning.

Notes :

Substantiating Evidence:

Outcome J


Mapped to: No Mapping

Measures & Findings

CAAP Program level; Direct - Exam

Details/Description: Portions of the CAAP test are required of all graduates of UTC. Students are randomly chosen to take different modules of the test: Reading, Writing Skills, Writing Essay, Mathematics, Science, Critical Thinking. The results of each graduating class on these test modules are used in the faculty evaluation of all Outcomes A-K/L at the end of the spring semester. Target: Senior Students

Implementation Plan (timeline): Once a year

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for CAAP Summary of Findings: All UTC graduating seniors are required to take the Collegiate Assessment of Academic Proficiency (CAAP), which provides measures of student General Education achievement. The CAAP, developed by the ACT organization, is a standardized, nationally normed test which measures proficiency in five elements of UTC’s general education program. ACT is a nonprofit corporation, founded in 1959, which offers a wide variety of assessment instruments and services, including the ACT test required by many colleges and universities for admission. More information on the CAAP test may be found at http://www.act.org/caap/, which is the source of the information in this report. As described on the web site http://www.utc.edu/Administration/PlanningEvaluationAndInstitutionalResearch/caap.php, graduating seniors are required to take the CAAP test as a condition of graduation from UTC, but each student is randomly assigned to be tested in only one of the five areas: writing skills, mathematics, reading, science, and critical thinking. Results are reported for the Computer Science program, the College of Engineering and Computer Science, and UTC as a whole, allowing comparisons among majors at UTC. National statistics also allow comparison of UTC students’ level of proficiency with that of students in a variety of universities across the United States. The original plan was that CS majors would be compared to all college majors, then all UTC students, then to national averages. However, the small number of CS graduates makes this impracticable. It would take a minimum of five graduates to have even one student taking each section of the test, and since students are randomly assigned to sections, having five students take the test would not ensure that each section would provide even a single result. This would not allow any results obtained to be statistically significant. Even looking at graduates of the College of Engineering and Computer Science still provides a very small sample size. However, since our students take the same general education courses as the other students at UTC, and since our students are, based on incoming qualifications, at least as well qualified as those in other majors, we feel justified in examining the performance of UTC students as compared to national norms, and extrapolating those results to our CS majors, to provide one measure of how well the outcomes covered by this test are met by our students. These results are not the only measure we apply, but since our other measures, such as surveys, are subjective, this objective measure has been retained, although it is imperfect. We will continue to use the information which CAAP provides, while continuing to search for any additional objective measures of this content. The results currently available are those of the 2008 and 2009 exams.

Target Achievement: Met

Recommendations : We need to give this exam more frequently.

Notes :

Substantiating Evidence:
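
To make the sample-size concern in the CAAP findings above concrete: with five graduates randomly assigned to the five sections, the chance that every section receives even one test taker is only 5!/5^5, roughly 3.8%. The sketch below is a hypothetical illustration of that calculation only; it is not part of the department's assessment tooling.

```python
# Hypothetical illustration of the sample-size problem described in the CAAP
# findings above: with n graduates each randomly assigned to one of the five
# CAAP sections, how likely is it that every section gets at least one taker?
import math

def coverage_probability(n_students: int, n_sections: int = 5) -> float:
    """Probability that n_students randomly assigned to n_sections leave no
    section empty, computed by inclusion-exclusion (surjection count)."""
    surjections = sum(
        (-1) ** k * math.comb(n_sections, k) * (n_sections - k) ** n_students
        for k in range(n_sections + 1)
    )
    return surjections / n_sections ** n_students

if __name__ == "__main__":
    for n in (5, 10, 15, 20, 25):
        print(f"{n:2d} graduates: P(every CAAP section covered) = "
              f"{coverage_probability(n):.3f}")
    # 5 graduates -> 5!/5**5 = 0.038, so even "one result per section" is
    # unlikely, let alone a statistically meaningful sample in each section.
```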

MFT Program level; Direct - Exam

Details/Description: The Major Field Test is given to seniors every two years. The results from the three subsections (Programming; Discrete Structures and Algorithms; and Systems) are also used in the faculty evaluation of all Outcomes A-K/L at the end of the spring semester. Target: Senior students

Implementation Plan (timeline): Every 2 years

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for MFT Summary of Findings: The Computer Science Major Field Test was developed by the Educational Testing Service, the nonprofit organization behind such well-respected exams as the GRE, the SAT, and PRAXIS. The computer science test is a two-hour, multiple-choice examination designed to test content knowledge in the areas of discrete structures (at least 15 percent), programming (at least 21 percent), algorithms and complexity (at least 16 percent), systems (at least 16 percent), software engineering (at least 3 percent), information management (at least 3 percent), and other topics such as intelligent systems, professional issues, and human-computer interaction. Results are provided for individual test takers and for the university computer science program as a whole. Assessment indicators, in the form of average percentage of correct answers in particular content areas, supply feedback as to whether students in a program are having difficulty in a given subject area. In addition, a Comparative Data Guide is generated each year, which allows a program to compare its performance to that of the numerous other programs, ranging from small liberal arts colleges to large state research universities, which participate in the exam nationally. Information on the MFT is taken from www.ets.org, and more information may be found at this site.

Due to cost constraints, this test was formerly given to graduating seniors only once every five years. However, in recognition of the need for more objective measurement of how well students in our program achieve outcomes and compare to those at other universities, it was agreed in 2008 that this test would be taken by graduating seniors every second year, beginning Spring 2008. The test covers Programming Fundamentals, Computer Organization and Architecture, Operating Systems, Algorithms, Computer Science Theory, and Computational Mathematics.

Results of the test in the past have been used as an informal measure of program success, but no formal process was in place. For evaluation of results, a process was proposed by which the Assessment Committee would compare our students’ scores, as a group, to the national averages by determining how many standard deviations above or below the national mean our students scored in each content category. Any areas in which our students scored more than one standard deviation below the national mean would be brought to the attention of the full CSE faculty at the annual fall retreat for discussion. (An illustrative sketch of this comparison rule appears after this measure.) As the number of students currently in the program is small, setting percentage goals for scores at this point was thought to be impractical. However, each year, the evaluation results were to be examined for trends and evidence of weaknesses in the program by the Assessment Committee, and appropriate recommendations would be made to the CSE faculty. As the number of students in the program increases, the use of numerical targets would be reexamined.

Target Achievement: Met

Recommendations : Results of the test are reported and compared to previous results.

Notes :

Substantiating Evidence:
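
The one-standard-deviation rule proposed in the MFT findings above can be expressed directly. The sketch below is a minimal, hypothetical illustration of that rule; the content areas and all score values are placeholders, not actual MFT results.

```python
# Sketch of the proposed MFT evaluation rule: flag any content area in which
# the program mean falls more than one standard deviation below the national
# mean. All numbers below are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class AreaResult:
    area: str
    program_mean: float   # mean percent correct for our students
    national_mean: float  # national mean percent correct
    national_sd: float    # national standard deviation

def flag_weak_areas(results: list[AreaResult], threshold_sd: float = 1.0) -> list[str]:
    """Return areas more than `threshold_sd` standard deviations below the
    national mean, for discussion at the annual fall faculty retreat."""
    flagged = []
    for r in results:
        z = (r.program_mean - r.national_mean) / r.national_sd
        if z < -threshold_sd:
            flagged.append(f"{r.area}: z = {z:.2f}")
    return flagged

if __name__ == "__main__":
    sample = [
        AreaResult("Programming", 58.0, 55.0, 9.0),
        AreaResult("Discrete Structures and Algorithms", 41.0, 52.0, 8.0),
        AreaResult("Systems", 49.0, 50.0, 9.5),
    ]
    for line in flag_weak_areas(sample):
        print("Bring to fall retreat:", line)
```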

Program Rubrics Program level; Direct - Student Artifact

Details/Description: Rubrics were identified by the faculty for each A-K/L Outcome. Each rubric is designed to assess one program outcome (A-K/L), and the extent to which each program outcome was met was measured against the course outcome competencies listed under that outcome. (An illustrative roll-up of rubric scores to outcomes appears after this measure.) Target: All Students

Implementation Plan (timeline): Every semester

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for Program Rubrics Summary of Findings: The course student surveys allow us to capture the students’ input into the assessment process. The course artifacts provide the instructor’s measure of student competencies. The assessment team’s evaluation of the course artifacts and evaluation of each Outcome A-K/L gives us a measure of the effectiveness of our instruction and the extent to which students meet the set objectives and program outcomes.

Target Achievement: Met

Recommendations : Needs fine tuning.

Notes :

Substantiating Evidence:
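
The rubric-based measure above rolls course-level competency scores up to the program outcomes A-K/L. The sketch below is a rough, hypothetical illustration of that aggregation step; the outcome labels, the 4-point scale, and all scores are invented for the example and are not the department's actual rubrics.

```python
# Rough illustration of rolling course-level rubric scores up to program
# outcomes A-K/L. Outcome labels, the 4-point scale, and all scores are
# hypothetical; the real rubrics are maintained by the faculty.
from collections import defaultdict
from statistics import mean

# (outcome, competency, score on a hypothetical 1-4 rubric scale)
rubric_scores = [
    ("A", "apply mathematics", 3.2),
    ("A", "apply science", 2.9),
    ("B", "design experiments", 3.5),
    ("K", "use modern tools", 3.8),
    ("K", "select appropriate techniques", 2.4),
]

def outcome_means(scores):
    """Average rubric scores for each program outcome."""
    by_outcome = defaultdict(list)
    for outcome, _competency, score in scores:
        by_outcome[outcome].append(score)
    return {outcome: mean(vals) for outcome, vals in sorted(by_outcome.items())}

if __name__ == "__main__":
    for outcome, avg in outcome_means(rubric_scores).items():
        print(f"Outcome {outcome}: mean rubric score {avg:.2f} / 4")
```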

Student/Course Survey Course level; Indirect - Survey

Details/Description: For each core course a team composed of those faculty who have special expertise with the course (they have taught the course or subsequent courses that have prerequisite competencies from the course) prepares a list of course objectives and a student survey designed to gauge the students’ perception of their attainment of the course objectives. The team identifies the mapping of course objectives to Outcome competencies. The team reviews the course objectives and mapping each spring or more often if review is requested by faculty teaching the course.

Target: All students

Implementation Plan (timeline): Every semester

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for Student/Course Survey Summary of Findings: The results of the student survey and portfolios of typical student work are included with the course report. The portfolio consists of all of the exercises and exams for at least three students – one excellent (A) student, one average (B-C) student, and one poor (D-F) student – and is included in the course report. The student survey results and portfolios in the course reports are used in the course review and evaluation of the appropriate competencies for Program Outcomes A-K/L.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Outcome K

Mapped to: No Mapping

Measures & Findings

Alumni Survey Program level; Indirect - Survey

Details/Description: The survey of alumni is the primary means of measuring the opinions of the alumni on how well the outcomes were achieved at the time of graduation; their responses, given from a more mature perspective, provide a valuable indicator of outcomes as well. The survey is a continuous online survey with security parameters sufficient to allow only legitimate alumni to respond.

Target: Alumni Students

Implementation Plan (timeline): Online

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for Alumni Survey Summary of Findings: The survey of alumni was initiated in spring 2008 and was planned to take place each spring semester. It was planned that graduates would be surveyed three years after graduation, to give them a chance to become fully functioning Computer Science professionals, and that each graduate would be surveyed only once. The survey conducted in spring 2008, of 2005 graduates, was to serve as a baseline against which future results would be compared.

Many of the same questions (e.g., regarding successful fulfillment of Outcomes) were asked on both the alumni survey and the senior survey. However, it was thought that the three years of professional practice between the two should give the graduate a more mature perspective, and a much more realistic assessment of how well this program actually prepared him or her for the profession; for this reason, the alumni survey would be useful in assessing fulfillment of both the Outcomes and the Objective of the program. The survey also included questions regarding such issues as continuing education, what the graduate now most wishes had been taught, commitment to lifelong learning (which is especially critical in the Computer Science profession), and strengths and weaknesses of the program. Some objective information, such as rates of graduate school acceptance and increasing levels of responsibility of alumni, could also be collected using this instrument. The surveys would also be used in the periodic assessment of the mission and objective, and in determining how well our outcomes imply our objective of preparing the student for successful practice of the Computer Science profession.

As with the senior survey, in 2009 the survey was altered to reflect the newly approved outcomes for the ISA program and for the Scientific Applications and Software Systems concentrations in the department; since these were the first officially approved outcomes for the ISA program, the 2008 results for the outcomes for the two other concentrations in the department are not shown. The survey was also converted to a web format for easier administration and easier collation and comparison of results from multiple years.

Results:

Target Achievement: Met

Recommendations : This survey will continue to be refined, and will continue to be administered each spring. The initial plan was for each student to be surveyed only once, three years after graduation, but with the web application, alumni from more than one graduation year responded. How this should be handled will be addressed by the Assessment Committee in the 2009-10 school year.

Notes :

Substantiating Evidence:

Course Objectives/Student Survey Program level; Indirect - Survey

Details/Description: For each core course a team composed of those faculty who have special expertise with the course (they have taught the course or subsequent courses that have prerequisite competencies from the course) prepares a list of course objectives and a student survey designed to gauge the students’ perception of their attainment of the course objectives. The team identifies the mapping of course objectives to Outcome competencies. The team reviews the course objectives and mapping each spring or more often if review is requested by faculty teaching the course.

Target: Students

Implementation Plan (timeline): Every semester

Key/Responsible Personnel: Instructor

Supporting Attachments:

Findings for Course Objectives/Student Survey Summary of Findings: The results of the student survey and portfolios of typical student work are included with the course report. The portfolio consists of all of the exercises and exams for at least three students – one excellent (A) student, one average (B-C) student, and one poor (D-F) student – and is included in the course report. The student survey results and portfolios in the course reports are used in the course review and evaluation of the appropriate competencies for Program Outcomes A-K/L.

Target Achievement: Met

Recommendations :

Notes :

Substantiating Evidence:

Industrial Advisory Board/Employers Contacts Program level; Indirect - Focus Group

Details/Description: Informal industry and employer interactions and inputs have always been part of the program. These encounters include informal focus groups with a few faculty members, faculty consulting, student internships, IEEE participation, and the College of Engineering and Computer Science Industrial Advisory Board (IAB). The College Industrial Advisory Board has been in place for many years, but is more focused on Engineering than Computer Science. The CSE department has established a departmental Industrial Advisory Board to provide more specific feedback on the needs of local industry related to our programs and the performance of our graduates whom they have employed. Target: Program

Implementation Plan (timeline): Twice a year

Key/Responsible Personnel: Dept. Head

Supporting Attachments:

Findings for Industrial Advisory Board/Employers Contacts Summary of Findings: The board was established in fall 2007; the first meeting of the board took place on January 24, 2008, and at least one meeting has taken place in each semester since that time. A précis of the board meeting is included in the “Assessment Results” section of Appendix B-4-3. All minutes of the meetings with the advisory board will be available for inspection by the members of the ABET team at the time of the fall visit. The evolution of the department assessment processes has led to the establishment of Reading Day, the day between the last day of classes and the first day of exams in each semester, as the regularly scheduled meeting of the Advisory Board with the department faculty, giving two regularly scheduled meetings each school year. However, in the event of special needs, additional meetings with the board will be scheduled. The members of the advisory board have demonstrated a strong commitment to supporting our programs now and in the future.

Target Achievement: Exceeded

Recommendations : More informal meetings.

Notes :

Substantiating Evidence:

Program Rubrics Program level; Direct - Other

Details/Description: Rubrics were identified by the faculty for each A-K/L Outcome. Each rubric is designed to assess one program outcome (A-K/L), and the extent to which each program outcome was met was measured against the course outcome competencies listed under that outcome.

Target: Student work

Implementation Plan (timeline): Once a year - Spring Semester

Key/Responsible Personnel: Faculty

Supporting Attachments:

Findings for Program Rubrics Summary of Findings: The course student surveys allow us to capture the students’ input into the assessment process. The course artifacts provide the instructor’s measure of student competencies. The assessment team’s evaluation of the course artifacts and evaluation of each Outcome A-K/L gives us a measure of the effectiveness of our instruction and the extent to which students meet the set objectives and program outcomes.

Target Achievement: Met

Recommendations : Needs fine tuning.

Notes :

Substantiating Evidence:

Senior Survey Indirect - Survey

Details/Description: A survey of seniors in the CSE concentrations was developed to be given each spring in CPSC 490, a course that all senior CSE students must take as a capstone. The survey is administered through the class Blackboard site, assuring that each student takes the survey once. The survey consists of the Outcome competencies that broadly measure the effectiveness of our program. Each survey question is mapped to one or more competencies. The survey questions and the mapping of the questions to Outcome competencies are given in Appendix E. The results of the survey are used in the faculty evaluation of all Outcomes A-K/L at the end of the spring semester.

Target: Students

Implementation Plan (timeline): Once a year when CPSC 490 is taught

Key/Responsible Personnel: Instructor

Supporting Attachments:

Findings for Senior Survey Summary of Findings: A survey of seniors in the CS concentrations was developed to be given each spring in CPSC 490, a course that all senior CS students must take as a capstone, ensuring that each student is surveyed once and only once. The plan was that after the instructor of CPSC 490 administered the survey, the Assessment Committee would perform an initial collation and evaluation of the results, which would then be discussed each fall in the pre-semester retreat for all CSE faculty. Changes, additions, and shifts in emphasis in the program may be made based on these results.

The form was designed to be numerically scorable, and therefore to facilitate the evaluation of trends, areas needing improvement, and effects of changes from year to year. (An illustrative sketch of this collation appears after this measure.) The Outcomes section of the survey has both “scored” responses and space for seniors to comment. All surveys and results will be available for inspection by future accreditation teams. A copy of the survey given to seniors in Spring 2008 is included in Appendix B-4-6-b, along with graphs of the results of the 2008 and 2009 administrations of the survey.

In 2009, the survey was altered to reflect the newly approved outcomes for the ISA program and for the Scientific Applications and Software Systems concentrations in the department; since these were the first officially approved outcomes for the ISA program, the 2008 results for the outcomes for the two other concentrations in the department are not shown. The survey was also converted to a web format for easier administration and easier collation and comparison of results from multiple years. Results from both the 2008 and 2009 surveys are discussed in the “Assessment Results” section of Appendix B-4-3. This survey will continue to be refined, and will continue to be administered in the senior capstone course each spring, to make sure that each student takes it only once and when he or she is close to graduation.

Target Achievement: Exceeded

Recommendations : Needs fine tuning.

Notes :

Substantiating Evidence:
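
Because the senior survey was made numerically scorable to support trend analysis, the collation step described above can be sketched as follows. The question-to-outcome mapping and all response values below are hypothetical placeholders, not the survey in Appendix B-4-6-b.

```python
# Sketch of collating numerically scored senior-survey responses by outcome
# and comparing means across administration years. The question-to-outcome
# mapping and all response values are hypothetical.
from collections import defaultdict
from statistics import mean

# question id -> program outcome(s) it maps to (hypothetical mapping)
QUESTION_TO_OUTCOMES = {"q1": ["A"], "q2": ["B", "K"], "q3": ["K"]}

# responses[year][question id] = list of Likert-style scores (1-5)
responses = {
    2008: {"q1": [4, 5, 3], "q2": [4, 4], "q3": [3, 2, 4]},
    2009: {"q1": [5, 4, 4], "q2": [5, 3], "q3": [4, 4, 3]},
}

def outcome_trends(responses, mapping):
    """Mean score per outcome per year, to support year-over-year comparison."""
    trends = defaultdict(dict)  # outcome -> {year: mean score}
    for year, answers in responses.items():
        per_outcome = defaultdict(list)
        for qid, scores in answers.items():
            for outcome in mapping.get(qid, []):
                per_outcome[outcome].extend(scores)
        for outcome, scores in per_outcome.items():
            trends[outcome][year] = mean(scores)
    return trends

if __name__ == "__main__":
    for outcome, by_year in sorted(outcome_trends(responses, QUESTION_TO_OUTCOMES).items()):
        trend = ", ".join(f"{y}: {m:.2f}" for y, m in sorted(by_year.items()))
        print(f"Outcome {outcome}: {trend}")
```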

Report: Assessment Plan Details for: Engineering: MS

Report Generated by TaskStream

Workspace: Academic Program Assessment

Assessment Plan: 2008-2009 Assessment Cycle: Assessment Plan and Assessment Findings

Assessment Plan Template: Academic Program Assessment

Report Generated: Friday, August 06, 2010

Measures and Findings

MS Engineering Outcome Set

Outcome

Fundamental Knowledge

Mapped to: No Mapping

Measures & Findings

Fundamental Knowledge Course level; Direct - Other

Details/Description: There are six concentrations in the Engineering master's program: Chemical, Civil, Computational, Electrical, Industrial, and Mechanical. Graduate students are required to take 3-5 core graduate courses in their disciplines. (An illustrative check of the grade target appears after this measure.)

Target: 80% of students will make a minimum of B (3.0/4.0) in all core courses.

Implementation Plan (timeline): Each semester

Key/Responsible Personnel: Instructor

Supporting Attachments:

Findings for Fundamental Knowledge Summary of Findings: 80% of students passed their core courses with a minimum grade of B. Target Achievement: Met

Recommendations : We would like to increase this ratio to 85% for next year.

Notes :

Substantiating Evidence:
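
A minimal sketch of checking the stated target (80% of students earning B, i.e. 3.0/4.0, or better in every core course) appears below. The grade data are hypothetical, and a standard 4.0 grade scale is assumed.

```python
# Minimal sketch of checking the target "80% of students will make a minimum
# of B (3.0/4.0) in all core courses." All grade data below are hypothetical.
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

# student -> grades earned in that student's core graduate courses
core_grades = {
    "student_01": ["A", "B", "B"],
    "student_02": ["B", "C", "B"],   # falls below the B minimum in one course
    "student_03": ["A", "A", "B"],
    "student_04": ["B", "B", "B"],
    "student_05": ["A", "B", "A"],
}

def target_met(grades_by_student, minimum=3.0, required_fraction=0.80):
    """Return (met, fraction): whether at least `required_fraction` of students
    earned `minimum` grade points or better in every core course."""
    meeting = sum(
        all(GRADE_POINTS[g] >= minimum for g in grades)
        for grades in grades_by_student.values()
    )
    fraction = meeting / len(grades_by_student)
    return fraction >= required_fraction, fraction

if __name__ == "__main__":
    met, fraction = target_met(core_grades)
    print(f"{fraction:.0%} of students met the B minimum in all core courses; "
          f"target {'met' if met else 'not met'}")
```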

Communication

Mapped to: No Mapping

Measures & Findings

Communication Course level; Direct - Other

Details/Description: Students need to communicate effectively in written and oral format with their classmates and instructor in each core course. Participation is an indication that students understand the concept of the course material. This will be evaluated using a combination of class interaction and peer evaluations that are used for group projects in core courses.

Target: 80% of students will get an average of 3.0/5.0 from their peer review evaluation.

Implementation Plan (timeline): Each semester

Key/Responsible Personnel: Instructor

Supporting Attachments:

Findings for Communication Summary of Findings: 80% of students received an average of 3.0/5.0 from their peer review evaluation. Target Achievement: Met

Recommendations : We would like to increase this ratio to 85% for next year, and we are going to do this by having a stricter participation policy.

Notes :

Substantiating Evidence:

Research

Mapped to: No Mapping

Measures & Findings

Research Course level; Direct - Other

Details/Description: Students are required to complete either a thesis or a project, applying the engineering tools and techniques that they learned in the program, and to present it. Students must form a committee consisting of an adviser and at least two committee members, one from the Department and one from outside, and work with them throughout their thesis or project. Students' performance is measured by grading their written report and by using a checklist to evaluate their presentations.

Target: 90% of students will make a minimum of 80/100 on their capstone checklist and written report.

Implementation Plan (timeline): Each semester

Key/Responsible Personnel: Thesis or Project Adviser and Committee

Supporting Attachments:

Findings for Research Summary of Findings: 90% of students made a minimum of 80/100 on their thesis or project checklist and written report.

Target Achievement: Met

Recommendations : We would like to raise this target to a minimum of 85/100 for 90% of students.

Notes :

Substantiating Evidence:

Diversity

Mapped to: No Mapping

Measures & Findings

No measures specified
