ASSESSMENT TASK FORCE
FINAL REPORT
Mission, Goals, Outcomes
Measurement & Data
Results & Analysis
Use of Results & Analysis for Improvement
MARCH 2011
Assessment Task Force Members
Nedra Alcorn, Associate Vice President
Office of Student Services
Rondall Allen, Pharm.D., Associate Dean for Student Affairs and Curricular Assessment
College of Pharmacy
B. Cecile Brookover, MBA, Ph.D., Chair; Director for Institutional Effectiveness and Assessment
Office of Planning, Institutional Research and Assessment
Jacques Detiege, M.S., Assessment Specialist
Office of Academic Affairs
Van Allen Gale, MBA, Senior Institutional Research Analyst
Office of Planning, Institutional Research and Assessment
Monique Guillory, Ph.D., Special Assistant to the Administration
Xavier University of Louisiana
Elizabeth Yost Hammer, Ph.D., Director, Center for Advancement of Teaching
Office of Academic Affairs
Treva Lee, Ph.D., Director for Institutional Research
Office of Planning, Institutional Research and Assessment
Jonathan G. Rotondo-McCord, Ph.D., Associate Dean, College of Arts and Sciences
Office of the Dean, College of Arts & Sciences
Lisa Schulte-Gipson, Ph.D., Associate Professor, Department of Psychology; Chair, Core Curriculum Assessment Committee
Executive Summary
Following a successful ten-year reaffirmation from SACS, Xavier initiated an assessment summit
to establish a strategic framework for assessment. While Xavier was found to be in compliance with all
SACS requirements and standards addressing assessment and outcomes assessment, the SACS On-site
Committee identified a number of areas for improvement. Members of Xavier’s SACS Leadership Team
and its Compliance Audit Committee also felt that a full review of the assessment process was
warranted and would set the stage for the next five- and ten-year SACS reviews. The summit was held on
June 15, 2010, and was facilitated by a professional consultant from Leadership Strategies. Objectives
included: (1) reach consensus on the purpose, or mission, of assessment at Xavier; (2) agree on a shared
vision and long-term goals derived from that vision; (3) identify critical success factors and barriers for
each of the long-term goals; (4) develop a list of possible strategies, or projects, for moving toward the
vision and achieving critical success factors; (5) select priority strategies from the list of possible strategies;
and (6) identify next steps, milestones, and accountabilities associated with the priority strategies.
An Assessment Task Force, chaired by Dr. Cecile Brookover, the Director for Institutional
Effectiveness and Assessment, was organized following the Summit and was charged to carry out a full
review of university assessment. The task force members represented a diverse set of nine
administrators, faculty, and staff from the Xavier community.1 The charge to this group included three
prime objectives: to map the areas where assessment occurs, to identify strengths and weaknesses in
the assessment process, and to identify best professional practices in assessment. In addition, the task
force was charged to develop a preliminary set of recommendations, to seek input regarding these
initial recommendations, and to report to the University Planning Council. This report documents the
findings of the Assessment Task Force and their recommendations for improvement of the assessment
process at Xavier.
The state of assessment at Xavier was certified as meeting core requirements and
comprehensive standards based upon the recent SACS reaccreditation. However, in carrying out its
charge, the Assessment Task Force found that units, departments, programs, and individuals have
widely varying abilities to understand, conduct, and use assessment activities appropriately. The Task
Force also found that while units, departments, and programs carried out the process of assessment
successfully through the SACS reaccreditation review, there was widespread skepticism about the utility of the
process. From focused interviews conducted as part of carrying out its charge, the Task Force concluded
that the assessment process in many areas was not yielding documented improvements, and that many
improvements were occurring outside the assessment process and were not well documented.
The recommendations from the Task Force are summarized
under the two major categories of The Assessment Process and Outcomes Assessment; within the two
categories the recommendations are prioritized as Essential (E) or should be adopted now; Short-Term
1 The Task Force members are Ms. Nedra Alcorn, Dr. Rondall Allen, Dr. Cecile Brookover, Mr. Jacques Detiege, Mr. Allen Gale, Dr. Monique Guillory, Dr. Elizabeth Hammer, Dr. Treva Lee, Dr. Jonathan Rotondo-McCord, and Dr. Lisa Schulte-Gipson.
(ST) or should be adopted in one year; and Long-Term (LT) or should be adopted by the next three-year assessment cycle.
Assessment Process Recommendations
Coordination of effort and systems would improve the effectiveness of the assessment process as well as the university’s institutional effectiveness. Resources related to conducting assessments would also
improve effectiveness. Recommendations related to the assessment process itself are as follows.
1. Establish a Xavier Assessment Group that serves as a monitoring and review committee for Xavier assessment practices. The committee would include appointed members from academic programs, administrative programs, and special programs as well as the Director for Institutional Effectiveness and Assessment, the Assessment Specialist, and the Chair of the Core Curriculum Review Committee. (LT)
2. The Director for Institutional Effectiveness and the Assessment Specialist should prepare a handbook for assessment at Xavier that explains the assessment process and includes specific instructions for carrying out assessments. (E)
3. The Director for Institutional Effectiveness and the Assessment Specialist should develop a website specifically for assessment that includes the aforementioned handbook and other resources. The website should be housed within the Office of Planning, Institutional Research and Assessment’s website. (E)
4. The cycle of five-year academic program reviews should be coordinated with the annual program assessments so that the program reviews incorporate the currently assessed learning outcomes related to the specific program being reviewed. (ST)
5. The programs with courses that are assessed in the Core Curriculum assessment process should coordinate with the Core Curriculum Review by providing a method of assessing core learning outcomes beyond the CAAP standardized test. (E)
6. Individual academic program units should coordinate their curriculum with their learning outcomes by preparing a curriculum map to document which courses relate to the learning outcomes being assessed. The TracDat system provides a tool to develop a curriculum map. (LT)
7. Planning, assessment, improvement, and institutional effectiveness should be better coordinated by fully implementing the available components of the TracDat software. (ST)
8. Student recruitment and retention plans formulated by the academic departments should be incorporated into TracDat outcomes assessment. (LT)
9. Smaller academic programs in similar disciplines should consider coordinating their assessment by adopting some common outcomes that could be assessed as a larger sample. (ST)
10. Emphasize the importance of documentation in the TracDat Assessment Management System, including documentation of outcomes, measures and related criteria, and improvements based upon analysis of data. (E)
11. Set up assessment on a continuing basis throughout the year, not just at the end of the year. (ST)
12. Adopt the recommendations from the Evaluation Group arising from the Assessment Task Force Department Chair Survey conducted in Fall of 2010. (ST and LT) Details of these recommendations can be found under the Evaluation Group section of this document.
Outcomes Assessment Recommendations
Specific recommendations related to outcomes assessment can be divided according to whether the outcome concerns academic programs or administrative units. Recommendations for academic
programs and related support programs include the following items.
1. SACS reviewers’ comments should be used to improve assessment. (E)
2. Increase process compliance to prior levels. (E)
3. Use weaknesses identified in Summer Program assessment to improve assessment year-round. (E)
4. Have more than one assessment method for each outcome so that evidence for meeting the outcomes will be strengthened. (E)
5. Limit the number of learning outcomes assessed in a three-year cycle to three to four specific (knowledge, skills, abilities) outcomes that can be assessed using multiple methods. (E)
6. Prepare a curriculum map to identify courses where outcomes can be assessed using embedded course assessment employing rubrics. (E)
7. Conduct additional training for use of TracDat, outcomes assessment, results analysis, and improvement using face-to-face and online training. (E)
Recommendations for administrative units include the following items.
1. Increase process compliance to prior levels. (E)
2. Emphasize the assessment of unit operations and processes. (E)
3. Determine the core functions of the unit that allow for reaching the goals of the office. (E)
4. Determine how the unit interacts with other administrative and/or academic units when carrying out its core functions. Administrative units carry out their functions across multiple offices. (E)
5. Choose three core functions and state them as outcomes for evaluation in each three-year cycle of assessment. (E)
6. Include the Institutional Effectiveness Survey results as an additional outcome. (E)
7. Have more than one assessment method for each outcome so that evidence for meeting the outcomes will be strengthened. (E)
8. Conduct additional training for use of TracDat, outcomes assessment, results analysis, andimprovement using face-to-face and online training. (E)
The Assessment Task Force presents these recommendations as steps in developing a process of
assessment at Xavier University of Louisiana that will ensure our university mission is met for all
students. Additionally, adoption and successful implementation of the recommendations will
ensure that assessment of our educational programs and administrative processes follows best
practices that encourage continuous improvement based upon careful evaluation and evidence. These
recommendations, when implemented in a systematic fashion according to the timeline presented
above, will establish a culture of assessment for continued success.
INTRODUCTION
Following a successful ten-year reaffirmation from SACS, Xavier initiated an assessment summit
to establish a strategic framework for assessment. While Xavier was found to be in compliance with all
SACS requirements and standards addressing assessment and outcomes assessment, the SACS On-site
Committee identified a number of areas for improvement. Members of Xavier’s SACS Leadership Team
and its Compliance Audit Committee also felt that a full review of the assessment process was
warranted and would set the stage for the next five- and ten-year SACS reviews. The summit was held on
June 15, 2010, and was facilitated by a professional consultant from Leadership Strategies. Objectives
included: (1) reach consensus on the purpose, or mission, of assessment at Xavier; (2) agree on a shared
vision and long-term goals derived from that vision; (3) identify critical success factors and barriers for
each of the long-term goals; (4) develop a list of possible strategies, or projects, for moving toward the
vision and achieving critical success factors; (5) select priority strategies from the list of possible strategies;
and (6) identify next steps, milestones, and accountabilities associated with the priority strategies.
An Assessment Task Force, chaired by Dr. Cecile Brookover, the Director for Institutional
Effectiveness and Assessment, was organized following the Summit and was charged to carry out a
complete review of university assessment. The task force members represented a diverse set of nine
administrators, faculty, and staff from the Xavier community.2 The charge to this group included three
prime objectives: to map the areas where assessment occurs, to identify strengths and weaknesses in
the assessment process, and to identify best professional practices in assessment. In addition, the task
force was charged to develop a preliminary set of recommendations, to seek input regarding these
initial recommendations, and to report to the University Planning Council. This report documents the
findings of the Assessment Task Force and their recommendations for improvement of the assessment
process at Xavier.
The first meeting of the Assessment Task Force was held on August 11, 2010. The Chair presented a
plan for the operation of the group, and members stated their preference for assignment to one of
three working subcommittees. Subcommittee assignments were as follows:
1. Evaluation Group — Allen, Brookover, Rotondo-McCord and Schulte-Gipson
2. Best Practices Group — Alcorn, Brookover, Guillory and Hammer
3. Mapping Group — Brookover, Detiege, Gale and Lee
In February 2011, an interim PowerPoint presentation on Task Force activities was given by Dr.
Brookover to the University Planning Council. The present document is the final report for the
Assessment Task Force.
2 The Task Force members are Ms. Nedra Alcorn, Dr. Rondall Allen, Dr. Cecile Brookover, Mr. Jacques Detiege, Mr. Allen Gale, Dr. Monique Guillory, Dr. Elizabeth Hammer, Dr. Treva Lee, Dr. Jonathan Rotondo-McCord, and Dr. Lisa Schulte-Gipson.
EVALUATION GROUP
The function of the Evaluation Group was to examine the strengths and weaknesses of the current assessment system at Xavier. Their examination included a review of preparation for the recent SACS review, a survey on assessment sent to the academic departments, a series of key informant interviews of assessment personnel, and a review of recent assessment reports.
Survey of College Curriculum Initiatives, 2000-2010
In December 2009, as part of Xavier University’s overall preparation for SACS reaccreditation review, the College of Arts & Sciences (CAS) Dean’s office requested the help of all college departments and divisions in identifying important changes and revisions, made over the past decade, that have affected either the core curriculum, departmental programs (majors and minors), or both. Department/division chairs were asked to respond to the following three questions:
1. During the past few years (both before and after Katrina), what have been the most significant curriculum revisions made by your department? Examples of such revisions could be (but are not limited to): changes to requirements for the major, minor, honors sequences, or labs; development of new key courses; changes to core courses and/or core requirements offered by your department; or substantial redesign of the way existing courses are taught.
2. Have any of these changes been made as a result of your department’s assessment (either direct or indirect) of student learning? If so, please explain.
Departmental summaries of key curriculum revisions, i.e., the natural end of any formal or informal assessment process, would better allow retrospective identification of linkages between educational outcomes assessment efforts (formal or informal, centralized or decentralized) and use of results to improve programs. In order to provide context for the recent work of the newly formed Assessment Task Force, a brief synopsis of the College Curriculum Initiatives report follows.3 Since the strategy of the report was to work “backwards” to identify points of linkage between curriculum change and assessment, the summary below will treat these two themes in that order. After a short listing of category types, selected examples are given for each type of curricular revision or assessment.
Types of curricular changes:
Revisions of departmental curricula during the years before and after Katrina can be grouped into the following main categories.
1. Introduction of new courses and revision or deletion of existing offerings.
2. Revision, introduction, and deletion of major and minor programs.
3. Revisions to existing core curriculum and adoption of new core.
4. Development of interdisciplinary programs (esp. Women’s Studies).
5. Declared incorporation of fundamental themes (e.g., globalization, service learning) into the broader curriculum.
Types of assessments leading to change:
3 For full report, see “College of Arts & Sciences Curriculum Initiatives, 2000-Present, Xavier University of Louisiana” (Jan. 2010). Cf. also “Xavier University of Louisiana, 2010 SACS/COC Focused Report” (Feb. 2010).
In general, the assessments considered in the report fall into the categories shown below.
6. Internal college and departmental self-studies (formal and informal)
7. Externally assisted evaluations
8. Student outcomes assessment (internal and external) of major programs and core
Examples:
Typically, the most common type of curricular change is (cf. item 1 above) the development of new courses, and revision of existing ones, in order to meet students’ learning needs more effectively and to keep a department’s offerings up-to-date and in line with new developments in research and learning. This also happens in the context of review and revision of a major program as a whole (item 2). A good example of both processes taking place in a complementary fashion was provided by the sociology department, which between 2005 and 2010 undertook the following revisions:
1. Reduction of maximum enrollment limits in core SOCI 1010 sections, with the goal of increasing critical thinking and writing skills and incorporating service learning (items 1, 3, and 5).
2. Development of Women’s Studies courses (item 4).
3. Use of ETS Major Field Test results to modify the sociology major (items 2, 7).
4. Review of nationwide discipline trends and use of both internal assessments and outside consultants to further improve the major program (items 1, 2, 6-8).
A second important area of curricular change in response to assessment can be noted where a department focused on a particular key part of its curriculum for improvement, rather than an overhaul of an entire major program. A good example of this kind of change took place in the biology department, which used an internal self-study to identify weaknesses in the BIOL 1230/1240 (General Biology for majors) sequence (items 1 and 6 above). In part as a result of this self-study, the department was awarded Louisiana Board of Regents funding to develop a multi-year program to revise and improve the BIOL 1230/1240 course sequence.
Another pattern of curriculum improvement took place in departments that chose to focus on the effectiveness of their core curriculum offerings. This was the case with the history department which, as a result of a two-year process of discussion and departmental retreats, developed a proposal to allow students a much wider range of choice in fulfilling their history core requirement than had previously been possible (items 3 and 6). Similar initiatives giving students greater flexibility of core course choice were also undertaken in the departments of communication studies and theology. A greater emphasis on globalization throughout the history curriculum (both core and upper-level) was also adopted, in part as a result of earlier pre- and post-test assessments that demonstrated low student knowledge of world and historical geography (items 5 and 8).
Streamlining the curriculum in the interest of greater efficiency and better use of resources has also taken place in some areas, especially after Katrina. Examples include the elimination of the major program in microbiology (biology department), and the minor programs in law and humanities (philosophy department) and international studies (political science) (item 2 above).
Finally, a major ongoing assessment and improvement project continues to be the assessment of the college core curriculum, which falls under the oversight of the CAS Core Curriculum Assessment Committee (CCAC), with the assistance of administrative units and academic departments (items 3 and 8 above). While no single assessment project led to the adoption of the new core (effective Fall 2010 for all new students), previous departmental assessments (both formal and informal) of their own core courses informed faculty discussion and deliberation during the extended process of considering the new core for adoption. The efforts of the CCAC have continued steadily on a yearly basis since Fall 2008, so that future revisions of the core will be solidly based on comprehensive data and a regular assessment plan.
Assessment Survey of Department Chairs
Rather than reduplicating the previous work of the comprehensive report described above, the Evaluation sub-committee commissioned a survey of academic department chairs, which was carried out by the Office of Planning, Institutional Research, and Assessment during the fall of 2010.
Background
In the fall of 2010, the evaluation subcommittee of the Assessment Task Force developed a survey to send to all department chairs in the College of Arts and Sciences. The survey was designed to determine the following: What assessment items are included in the assessment plans of the various departments? Which stakeholders are actively involved in the assessment process? Do the departments have adequate resources to complete the assessment activities in their respective plans? And are the departments using assessment results to improve their programs?
Demographics
Surveys were sent out to the 18 departments composing the College of Arts and Sciences.
Fourteen surveys were returned by the specified deadline (a response rate of 78%). See Appendix B for
a list of the 14 departments that completed the survey. The average number of assessment goals
developed was approximately seven (M = 6.62, SD = 2.26), with approximately 69% of them assessed
yearly. Seventy-one percent of responding departments did not have an accrediting body; the remaining
29% did.
Item A - Which of the following are included in your assessment plan?
In response to the item “Which of the following are included in your assessment plan,” 100% of
departments indicated that student learning outcomes were included. Ten of the 14 responding
departments (71%) indicated that student performance on senior comprehensive exams was included
in assessment outcomes. Five of the 14 responding departments (36%) indicated that department goals,
university mission, and student performance on national exams were included in their assessment plan.
Four of the 14 departments (29%) indicated that department mission and student progression were
included in their assessment plan. For a breakdown of the results by department see Appendix B, Table
1.
Item B - Who of the following is involved in developing the assessment plan for your
department/division?
In response to the item “Who of the following is involved in developing the assessment plan for your department/division,” nine of the respondents (64%) indicated the chairperson, seven of the
respondents (50%) indicated “a designated faculty group,” and five (36%) indicated other. Other
responses suggested that all faculty are involved and, in one response, that the former chair was
involved. For a breakdown of the results by department see Appendix B, Table 2.
Item C - Who of the following is involved in data collection related to the assessment plan for your
department/division?
In response to the item “Who of the following is involved in data collection related to the assessment plan for your department/division,” seven of the respondents (50%) indicated a designated faculty group,
seven indicated the chairperson, and six (43%) indicated other. Other responses tended to indicate
either all faculty (two responses) or individual faculty (two responses). Two respondents indicated that
no one has been assigned this responsibility in their department. Table 3 in Appendix B provides a
breakdown of this question by department.
Item D - Who of the following is involved in discussing/interpreting assessment results for your
department/division?
In response to the item “Who of the following is involved in discussing/interpreting assessment
results for your department/division,” nine of the respondents indicated the chairperson, six indicated a designated faculty group, six indicated other, and five indicated a designated faculty person. In general,
other responses indicated that all faculty are involved. Table 4 in Appendix B provides a breakdown of
this item by department.
Item E - Who is involved in evaluating the effectiveness of the assessment plan?
In response to the item “Who is involved in evaluating the effectiveness of the assessment plan,” six of the respondents (43%) indicated the faculty and six of the respondents (43%) indicated other.
Other responses indicated that all faculty are involved (three responses) or that the chair collaborates
with one or two faculty members (one response). A breakdown by department can be found on Table 4
in Appendix B.
Item F - What priority does assessment take in relation to other departmental duties?
Overall, the 14 departments ranked the six items for this question in the following order from
most important to least important: teaching, advising, scholarship, university service, assessment, and
community service. Overall, assessment has the second lowest priority. Other responses to this question included: “balancing all of the above and sharing with each other our thoughts and experiences in our
teaching, scholarship, etc.” and “The above account for 70 plus hours per week. Is there any time left?”
Table 5 in Appendix B provides the responses by department.
Item G - Do you have adequate resources to assess the items in your plan?
Ten respondents (71%) replied yes to this item, while four (29%) replied no. Respondents who replied in the negative indicated that they needed more time and/or “human assessment experts to
evaluate the effectiveness, suggest improvements, and collect data.”
Item H - Are the results of the assessment plan being used to improve the department/division? Who
decides how the results are used?
Thirteen of the respondents (93%) indicated that the assessment plan is being used to improve the department/division. Ten of the respondents (71%) indicated that faculty members decide how the results will be used to improve the department.
Item I - How many times per academic year does your department/division meet to discuss and/or review the assessment plan?
Respondents reported meeting an average of 2.43 (SD = 1.02) times per academic year to discuss and/or review the assessment plan.
Item J - My department/division uses previous assessment results to revise teaching and learning
content, methods, and strategies.
On a one to seven Likert-type scale with one being strongly disagree and seven being strongly
agree, the mean response was 5.43 (SD = 1.99).
Recommendations from Analysis of the Chair Survey about Assessment
Based upon the data reported, the process of assessment varies across the different programs;
some of the departments are not assessing all of their goals. All of the departments include student learning outcomes in their assessment plan, and the majority of the departments (10 of 14, 71%) assess
student performance on the senior comprehensive exams. About a third of the departments assess the
university mission and student performance on national exams. Only one-third of the respondents reported that they include their division goals in their assessment plan. This may be a contradiction to the responses in questions 1 and 2 or a misunderstanding of the question. There were four departments that indicated they have an accrediting body. However, only one department reported that they included accreditation requirements in their assessment plan.
The majority of the respondents indicated that the Department Chair and a designated faculty
group are responsible for developing the assessment plan. However, in some departments all faculty members are engaged in the process. Although some departments include other stakeholders, the designated faculty group and the chairperson are primarily responsible for discussing/interpreting the assessment results. The majority of the respondents indicated that the Department Chair and a designated faculty group are responsible for evaluating the effectiveness of the assessment plan. However, in some departments all of the faculty members are engaged in the process. Two
respondents indicated that no one had been assigned that responsibility for their department.
Ten of the respondents indicated that they have enough resources to assess the items in their assessment plan. All of the departments except one indicated that they are using the assessment results to improve their respective departments. The majority of the respondents indicated that faculty are responsible for making decisions on how the results will be used.
Twelve of the 14 departments state that they use previous assessment results to improve
teaching and learning content, methods, and strategies. Also, results from the Department and Division
Chair Survey indicate that many chairs are using the assessment results for curricular changes and
departmental improvements. However, the actual documentation of the use of assessments in the
TracDat assessment system does not indicate the same level of use of assessment results.
Recommendations
1. The Department Chairs and the faculty should determine why all of their goals are not being
assessed.
2. The Department Chairs and the faculty should evaluate their assessment plans to ensure their
goals are relevant, meaningful, and measurable.
3. The Department Chairs should rank-order goals, assessing essential goals more frequently than less
essential goals.
4. The university should develop a list of core assessment items that should be included in the
departmental reviews and assessment plans of each department. The following items should be
considered for the list: student learning outcomes, department mission, department goals,
university mission, student progression, student performance on senior comprehensive exams,
student performance on national exams (if applicable), post-graduate training placement (if
applicable), and accreditation requirements.
5. All departments should consider involving all of the key stakeholders (e.g. administrators, faculty,
staff, students, and alumni) in developing the assessment plan.
6. All departments should consider involving all of the key stakeholders (e.g. administrators, faculty,
staff, students, and alumni) in discussing/interpreting the assessment results.
7. Encourage the Deans to work with the division chairs in meeting their needs to carry out the
assessment activities in their respective departments. Training and additional personnel were the
needed resources listed by the respondents.
8. Continue to encourage division chairs and the faculty to close the loop on all assessment projects
by deciding upon actions to be taken based upon assessment results.
9. All departments should consider adding a student representative to their assessment team.
10. Encourage all chairs and faculty to use assessment results to improve teaching and learning in
their respective department.
11. The university could clarify priorities of activities (advising, assessment, community service,
scholarship, teaching, and university service).
Key Informant Interviews
Another component of the Evaluation Group’s process was to obtain information from people at
Xavier who had experience in assessment or in the SACS accreditation process by conducting a set of
“key informant interviews”. Key informant interviews are a needs assessment and evaluation tool used
to draw upon the expertise of people within an organization. Generally, they are semi-structured
interviews, in which the interviewer begins with a set of prepared questions, but additional questions may
be asked depending on the responses given. The prepared questions were as follows:
1. Briefly describe your experience with assessment at Xavier.
2. Given your specific responsibilities for assessment at Xavier, what can be done to improve that portion of assessment?
3. In your opinion, what is the strongest aspect of evaluation at Xavier and why?
4. In your opinion, what is the weakest aspect of evaluation at Xavier and why?
5. How could the process of assessment be improved?
6. Overall, what can be done to improve assessment at Xavier?
7. Please add anything that you think would help our committee with its task.
Interviews were conducted with Dr. Marguerite Giguette, Associate Vice President for Academic Affairs;
Dr. Ronald Durnford, Vice President of Planning, Institutional Research and Assessment; Dr. Rondall
Allen, Associate Dean for Student Affairs and Curricular Assessment, COP; Dr. Linda Mihm, Clinical
Associate Professor, Division of Clinical and Administrative Sciences, COP; Dr. Lisa Schulte-Gipson,
Associate Professor, Department of Psychology and Chair of the Core Curriculum Assessment
Committee; and Mr. Jacques Detiege, Assessment Specialist for Academic Affairs. The following table
summarizes responses to selected questions from the Key Informant Interviews.
Summary of Key Informant Interviews
A - What is the strongest aspect of assessment at Xavier and why?
1. In general people are completing their assessments and have a decent understanding of them. Core Curriculum assessment has progressed a long way and has an established plan thanks to the help of Jonathan Rotondo-McCord and others.
2. In principle, it is comprehensive. Each unit assesses outcomes; faculty and staff are periodically reviewed; both administrative and academic units are periodically evaluated.
3. I believe that the culture of assessment is changing. Many individuals may not like assessment, but it seems that (more and more) they realize the importance of assessment.
4. Our Program Assessment Committee (COP)
5. The commitment to performing assessment, but there is not enough time to do it
6. Capacity of the assessment staff

B - What is the weakest aspect of assessment at Xavier and why?
1. Assessment is more focused on process rather than tying learning outcomes to the curriculum. The biggest weakness is that assessment is not used for program changes. We need to ask "How has assessment led to this curriculum change?"
2. Most change occurs outside the formal assessment process; post-SACS process compliance has tapered off; the assessment too often does not lead to systematic improvements; assessment-driven actions and improvements are not always well documented; outcomes and measures are often pro forma.
3. Speaking from the perspective of the CCAC, the process of developing valid means to assess all aspects of the core is quite challenging.
4. Faculty engagement and assessment resources
5. Lack of expertise and training
6. Understanding and appreciation of the value of assessment among the general staff

C - How could the process of assessment be improved at Xavier?
1. The assessment cycle for programs has a due date that occurs when generally only Chairs are on campus. Who performs the "assessment of the assessments" and generates recommendations? Review of assessments should be conducted at more than one level (currently the Director of IR and Assessment and the Planning, IR and Assessment office) and include an Academic Affairs voice at the level of Dean and above. Refine TracDat language to avoid misunderstanding of the phrase "close the loop". Determine why an action plan is required for all outcomes that have been met.
2. To improve, assessment must be perceived to be useful and must actually be useful. A combination of training, engagement, persuasion, feedback and encouragement. The process and timelines need to be clear. Perhaps an assessment crib sheet. Examples provided of effective and less-than-effective outcomes and measures. Automatic reminders with escalation.
3. Greater standardization in some (not all) areas, as with the previous example of administration of the CAAP & CAT tests.
4. Designated data person; designated assessment person; assessment in a timely manner; better dissemination
5. Assessment data person; database expert
6. Consequences for not complying with required documentation; rapid turnaround in reporting of assessment information collected

D - Overall, what can be done to improve all assessment at Xavier?
1. There needs to be more uniformity across programs in assessment. For example, should we consider using the ETS major field test as a senior comprehensive exam for all programs along with consistent criteria for passing? Currently Core Curriculum assessment is treated separately from program assessments, but specific core learning outcomes, such as writing, communication, etc., could be a part of departmental assessments. Program learning outcomes must address more students than just the majors in the program. Develop a system to handle assessment of small departments with very few majors (related to the prior statement).
2. Some of the pieces need attention: better integration of institutional effectiveness survey results and follow-up detail-driven surveys; incorporation of class evaluations into departmental learning outcomes; revision of program review; more effective administrative unit review.
3. Focus on improvement of process; focus on ensuring that assessments are valid.
4. Separate assessment department
5. Separate assessment department
6. Person specifically assigned to manage data and databases for the university as their only job
The Key Informant Interviews produced strengths and weaknesses of the assessment process that
harmonized with the other analyses cited by the Evaluation Group. However, some distinctive or
emphasized elements were found, including integration of Institutional Effectiveness Survey results into
assessments of administrative units, acquiring a database and a data person to ensure good data for
assessments, engagement in the assessment process by Xavier faculty and staff, more effective
administrative unit review, incorporation of class evaluations into the departmental learning outcomes,
and revision of the academic program review process.
Summary of Recent Assessment Reports
During the SACS reaccreditation process, the following strengths and weaknesses were identified
by the reviewers.
1. Process compliance is good
2. Clear weaknesses in many outcomes and measurement of them
3. Missing opportunity to link student learning outcomes to academic program reviews
4. Improvements occurring outside the assessment process
5. Assessment process not driving improvements and change
In the report on Summer Programs the Assessment Specialist identified the following issues.
1. Documentation of reports in TracDat
2. Timeliness of data collection and reporting
3. Quality/Consistency of data and measurement
4. Documentation of result status
In the 2009-2010 Assessment Focus Report the Director for Institutional Effectiveness and
Assessment raised these areas of concern based upon her analysis of assessment at Xavier.
1. Lack of timely compliance with the performance and documentation of assessment in TracDat is
a serious problem for both academic departments and non-academic units.
2. Quality of outcomes and measurement is weak in many areas, often with only one measurement
or criterion for a single, very specific outcome. Outcomes need to be broader with multiple
methods of measurement.
3. Very little documentation can be found for improvements and change related to assessment.
4. Group and one-to-one training is needed in using the TracDat system, writing and measuring learning
and other outcomes, and documenting improvements and change.
5. New learning and other outcomes are needed for the three year cycle beginning in academic
year 2010-2011 with an emphasis on broader outcomes and multiple means of assessment for
the outcomes.
6. High-level training in the TracDat assessment system is needed for the Assessment Specialist and
the Director of Assessment.
High-level training on the new version of TracDat occurred on January 24 and 25, 2011.
BEST PRACTICES GROUP
The task set forth for the Best Practices Group was to perform a literature review of best
practices by examining all resources available on the professional practice of assessment. This
examination included review of hundreds of articles, documents and websites related to assessment.
The American Association for Higher Education (AAHE) is a national organization that includes members
from the entire spectrum of higher education stakeholders — administrators, faculty members,
policymakers, accrediting agencies, government, students, and business. AAHE published and
disseminated a document entitled Nine Principles of Good Practice for Assessing Student Learning (Astin
et al., 1996), which can be summarized as follows.
1. The assessment of student learning begins with educational values. We should assess what we value in producing graduates from our universities and use assessment to improve our ability to educate our students. The values of an educational institution are found in its mission statement.
• Example — Alverno College, Mission Statement. Alverno College is an institution of higher education dedicated to the undergraduate education of women. The student — her learning and her personal and professional development — is the central focus of everyone associated with Alverno.
Alverno College expands on its mission statement with an explication of four major purposes. The
first major purpose is directly related to academic program offerings and is shown below.
Creating a curriculum
The curriculum, designed by faculty as the major source for student attainment of educational
goals, includes both a philosophy and a program of education. It is:
• ability-based and focused on student outcomes
• integrated in a liberal arts approach to the professions
• rooted in Catholic tradition
• designed to foster leadership and service in the community
• flexible, to accommodate the educational goals of women with diverse responsibilities
• affordable, to accommodate women’s economic circumstances
• Xavier — Xavier University of Louisiana, founded by Saint Katharine Drexel and the Sisters of the Blessed Sacrament, is Catholic and historically Black. The ultimate purpose of the University is to contribute to the promotion of a more just and humane society by preparing its students to assume roles of leadership and service in a global society. This preparation takes place in a diverse learning and teaching environment that incorporates all relevant educational means, including research and community service.
2. Assessment is most effective when it reflects an understanding of learning as multidimensional, integrated, and revealed in performance over time. In educational institutions, the mission and goals of the university flow downward into missions and goals for the colleges, departments, core curricula, special programs, student services, and administrative services, resulting in an integrated system that is measured in cycles of time and that includes multiple outcomes and multiple methods of measuring those outcomes. Such a system provides a strong structure for determining how to improve the education of students.
• Example — Pennsylvania State University: Assessment occurs throughout the University. It is found in classrooms, programs, departments, offices and colleges, and ultimately at the University level. (Pennsylvania State University Office of Planning and Institutional Assessment. (2005). Assessing for Improvement. Innovation Insights, 11.) http://www.psu.edu/president/pia/innovation/Assessing_for_Improvement_2.pdf
• Xavier — Xavier has worked to incorporate all assessment and assessment records into a central repository using the TracDat assessment system.
3. Assessment works best when the programs it seeks to improve have clear, explicitly stated purposes. The process of assessment requires that each unit in the institutional hierarchy is in agreement on the goals that it wishes to achieve and where in the student learning framework those goals can be realized. When performed properly, the assessment process clarifies what is important, allowing each unit to focus on useful assessment that leads to improvement rather than an exercise in measuring what you know you can easily achieve.
• Example — University of Central Florida, Academic Assessment Handbook. The purpose of this chapter [DEFINING STUDENT LEARNING OUTCOMES (SLO)] is to provide you with an overview and definition of student learning outcomes. The importance of explicitly defining expectations and standards is emphasized in this chapter. Also included is an extensive discussion on how to write clear and precise statements of outcomes for your program. http://oeas.ucf.edu/doc/acad_assess_handbook.pdf
• Xavier — Xavier has begun a series of training sessions for academic and administrative departments that emphasizes writing mission statements, goals, objectives, and student learning and other outcomes that are clearly stated with precise criteria for measurement.
4. Assessment requires attention to outcomes but also and equally to the experiences that lead to those outcomes. In educating students, universities produce graduates who have particular skills, abilities, and knowledge. In order to produce graduates of quality, the university needs to have outcomes that measure all conditions and experiences that shape the students, including specific courses, curricula, programs, and extra-curricular experiences.
• Example — The University of Texas System. In higher education, we typically talk about knowledge, skills, abilities, and competencies as being one and the same. For example, we speak of competent mathematicians and knowledgeable mathematicians. Yet, skills and knowledge are acquired through learning experiences; the different combinations of skills and knowledge one has acquired in a given program define the competencies an individual possesses. These competencies are acquired through integrative learning experiences provided by academic programs. (Conceptual Framework, 2000.) http://www.utsystem.edu/aca/initiatives/assessment/conceptualFramework.htm
• Xavier — Xavier has begun encouraging multiple measures that include all aspects of student learning in the next three-year cycle of outcomes assessment, which begins with the 2010-2011 academic year.
5. Assessment works best when it is ongoing, not episodic. Quality assessment is not a hectic push that occurs whenever accreditation rolls around, but a process of tracking outcomes in which individuals and groups of students are measured from semester to semester and year to year, in multiple-year cycles, in their progress toward stated outcomes. The ongoing process of assessment leads to ongoing improvement in achieving outcomes as well as to improvement in the assessment process itself.
• Example — North Carolina State University guidelines. The assessment cycle is continuous. It should identify/document strengths, weaknesses, needs, improvements and future plans. http://www.ncsu.edu/uap/academic-standards/uapr/process/language.html
• Xavier — Xavier has completed its first three-year assessment cycle using the TracDat assessment management system and has begun its next three-year cycle using new outcomes.
6. Assessment fosters wider improvement when representatives from across the educational community are involved. Every part of the university community, including constituencies beyond the campus, contributes to the quality of students that are produced. This means that each university should encourage a culture of assessment across its campus and beyond. There should be no question that the process of assessment is respected and valued and that assessment activities are completed with care on a timely basis.
• Example — St. Olaf College, winner of a 2010 Council for Higher Education Accreditation Award. St. Olaf College is a nationally ranked liberal arts college with an enrollment of approximately 3,000. The college has developed an innovative model of assessment that sustains and supports student learning. Strong faculty leadership, extensive faculty and staff engagement, administrative support, grant-funded inter-institutional partnerships, and student engagement in developing instruments and analyzing results are components of the college’s assessment work.
• Xavier — Xavier is working to create a culture of assessment that reaches across the entire university. Its first steps were the collaboration across the entire Xavier community to obtain SACS reaccreditation and to develop its QEP, Read Today, Lead Tomorrow.
7. Assessment makes a difference when it begins with issues of use and illuminates questions that people really care about. The process of assessment should produce information and evidence that is deemed useful for making decisions in a system of continuous improvement. The goal of assessment is not to generate data that pats the institution, department, or unit on its back, but data that leads to making good decisions about how to proceed in the future.
• Example — North Carolina State University. Objectives and outcomes help translate the very broad goals of university, college, and department mission statements into the curriculum by which you fulfill that mission. They describe the knowledge, attitudes, and skills you want students to have when they finish a part of your program. They can apply to individual assignments, to courses, or to whole programs.
• Xavier — New guidelines for writing outcomes for academic and administrative units at Xavier have each program or office generate a list of the essential attributes their students should possess or functions that the administrative unit should perform. For academic programs, that list describes what knowledge, skills, and abilities students should possess upon completion. Next, outcomes are written with appropriate measurement so that programs and offices can determine whether the outcomes are achieved.
8. Assessment is most likely to lead to improvement when it is part of a larger set of conditions that promote change. Assessment needs to be part of the institution’s culture so that it is integrated into all decision-making processes, such as planning, budgeting, personnel, and curricula. For example, curriculum changes should also be made based upon evidence produced by assessment.
• Example — Pennsylvania State University. Creating a Culture for Innovation and Improvement: Lessons Learned: Integrate CQI [Continuous Quality Improvement] into the core processes of the institution — how you hire, what you reward, what you communicate, how you measure, and how you develop faculty and staff. What gets measured is what gets done, and what’s valued is what is rewarded. http://www.psu.edu/president/pia/innovation/creating_a_culture.pdf
• Xavier — One way Xavier is considering implementing a culture of assessment is to require curriculum change proposals to include assessment data. Core Curriculum assessment data led to the choice of reading as the focus of the QEP, Read Today, Lead Tomorrow.
9. Through assessment, educators meet responsibilities to students and to the public. As educators we have a responsibility to foster quality and improvement in educating our students; assessment provides us with the mechanism to do so. Also, by fulfilling this responsibility ourselves, universities are better able to control the imposition of one-size-fits-all mandates from government.
• Example — The University of Texas System, Our Commitment to Accountability. One of the highest priorities of the Chancellor and the Board of Regents of The University of Texas System is to be accountable — to take responsibility for measuring and reporting the effectiveness of our work and to use that information to continuously improve our performance. http://www.utsystem.edu/osm/files/FactSheet.pdf
• Xavier — In order to fulfill our mission, which is basically an implied promise to our students and the public, Xavier must continue to improve the quality and reporting of our assessment system. The Assessment Summit and this Assessment Task Force Report provide a plan for the activation of a system of integrated best practices in assessment for Xavier University of Louisiana that will lead to university improvement.
A review of approximately 75 college and university websites, for institutions that grant at a minimum a four-year undergraduate degree, revealed the following commonalities. Almost all universities have an assessment committee composed of members from the greater community who have expertise or interest in assessment. The committees are usually appointed. The majority of institutions have an assessment website with links to resources related to assessment best practices and other information. Almost all have an assessment handbook that spells out the assessment process for the university. Some have separate handbooks for administrative units, and almost all include assessment forms and examples of assessments that meet the requirements set forth in the handbooks.
Some of the best sources for best practices are listed in the annotated bibliography in Appendix C. The resources are divided into the categories of General Assessment, Culture of Assessment, Measurement and Evidence, and Using Assessment for Improvement. Links are provided to the resources when available. These links can be placed on an Assessment Web Page.
MAPPING GROUP
The purpose of the Mapping Group was to document all assessment activities at Xavier. First,
members of the Task Force all documented their own involvement with assessment at Xavier. See the
charts below for documentation for the Offices of Academic Affairs and Planning, Institutional Research
and Assessment. In the Center for Advancement of Teaching, assessments are reported via the TracDat
system, annual FaCTS reports to the Mellon Foundation, and internal reports for annual planning.
Results from these reports are used to plan for the next year’s programming. Assessment in the College
of Pharmacy is set up with a Program Assessment Committee, which includes members from Basic
Sciences and Clinical Faculty members, administrators from the College, and the Associate Dean for
Student Affairs and Curricular Assessment. See the table below listing the primary offices and positions
within the offices that have assessment responsibilities at Xavier.
A listing of all active grants and funds was obtained from the Office of Fiscal Services, Grants
and Contracts Accounting. Analysis of this list revealed that many of the items were actually
research grants that required an annual progress report but no actual evaluation. An example of that
type of item would be research grants funded under the NIH-NIGMS SCORE Grant Program. Using the list
as a guide, a survey was sent to determine assessment activities related to grants and contracts (see
Appendix D for the survey results). In order to be comprehensive, surveys were sent to all persons listed
as Principal Investigators on the list. Responses were received from 32 PIs. Follow-up emails, telephone
calls and investigation resulted in compiling information about assessment activities on all items on the
list. All items required an annual progress report, and 34 items required an evaluation in the annual
report. Of those 34, external evaluators (outside of Xavier) were required for eight of them (24%);
internal evaluators (from Xavier) performed the assessments for the remaining 26 (76%). Most internal
assessments are performed by the Assessment Specialist or Director for Institutional Effectiveness.
Assessment Responsibilities at Xavier

Office: Office of Planning, Institutional Research, and Assessment (OPIRA)
Position: Vice President for Planning, Institutional Research, & Assessment
Assessment Responsibilities:
• SACS Liaison
• Emphasis on SACS 2.5, 3.3.1, and 3.2.8
• Planning
• Institutional Effectiveness
Reporting:
• SACS Liaison and SACS Reports
• Supervisor for OPIRA activities, reports, and surveys
• Planning Focus Report
• Strategic Plan and Planning
• Oversight and reporting on University Institutional Effectiveness process
• Special Reports and Analyses (Competitor Profile, strategic information summaries, etc.)

Office: OPIRA
Position: Director for Institutional Effectiveness & Assessment
Assessment Responsibilities:
• Academic Programs (CAS & COP)
• Academic Support Programs
• Administrative Units
• Administrative Support Units
• TracDat Administration for all Xavier Assessments
• Institutional Effectiveness
Reporting:
• Assessment Focus Report on all annual University assessment activities
• Assessment Task Force Report (current year)
• Annual Institutional Effectiveness Report
• SACS Reports

Office: OPIRA
Position: Director for Institutional Research
Assessment Responsibilities:
• Instructor Evaluations
• SACS 4.1 Student Achievement
Reporting:
• Instructor Evaluation Reports
• SACS Reports
• Annual Environmental Focus Report

Office: OPIRA
Position: Senior Institutional Research Analyst
Assessment Responsibilities:
• Prepares & analyzes assessment surveys, including the Institutional Effectiveness Survey
• Archives assessment data
Reporting:
• Survey Results Reports

Office: Office of Academic Affairs
Position: Assessment Specialist
Assessment Responsibilities:
• All Summer Programs, including NASA
• QEP
• I Cubed
• Freshman Seminar
• Teagle Foundation
• Various special assignments
Reporting:
• Annual Summer Programs Report
• QEP Reports
• I Cubed Reports
• NASA Report, etc.

Office: Office of Academic Affairs
Position: Director for the Center for Advancement of Teaching
Assessment Responsibilities:
• Course Portfolio Working Groups, including QEP Reading
• FaCTS Grants
• Maintains Assessment Toolbox & links to resources
• Classroom observations and other assessment services
Reporting:
• TracDat Report
• FaCTS Reports to Mellon Foundation
• Internal Reports for Planning
FINAL RECOMMENDATIONS
The state of assessment at Xavier was certified as meeting core requirements and
comprehensive standards based upon the recent SACS reaccreditation. However, in carrying out its
charge, the Assessment Task Force found that units, departments, programs, and individuals have
widely varying abilities to understand, conduct, and use assessment activities appropriately. The Task
Force also found that while units, departments, and programs carried out the process of assessment
successfully for the SACS reaccreditation review, there was widespread skepticism about the utility of the
process. From focused interviews conducted as a part of carrying out its charge, the Task Force concluded
that in many areas the assessment process was not yielding documented improvements and that many
improvements were occurring outside the assessment process; those improvements were not well
documented. The recommendations from the Task Force are summarized under the two major
categories of The Assessment Process and Outcomes Assessment; within the two categories the
recommendations are prioritized as Essential (E) or should be adopted now; Short-Term (ST) or should
be adopted in one year; and Long-Term (LT) or should be adopted by the next three year assessment
cycle.
Assessment Process Recommendations
Coordination of effort and systems would improve the effectiveness of the assessment process as well
as the university’s institutional effectiveness. Resources related to conducting assessments would also
improve effectiveness. Recommendations related to the assessment process itself are as follows.
13. Establish a Xavier Assessment Group that serves as a monitoring and review committee for Xavier assessment practices. The committee would include appointed members from academic programs, administrative programs, and special programs as well as the Director for Institutional Effectiveness and Assessment, the Assessment Specialist, and the Chair of the Core Curriculum Review Committee. (LT)
14. The Director for Institutional Effectiveness and the Assessment Specialist should prepare a handbook that explains the assessment process at Xavier and includes specific instructions for carrying out assessments. (E)
15. The Director for Institutional Effectiveness and the Assessment Specialist should develop a website specifically for assessment that includes the aforementioned handbook and other resources. The website should be housed with the Office of Planning, Institutional Research and Assessment’s website. (E)
16. The cycle of five-year academic program reviews should be coordinated with the annual program assessments so that the program reviews incorporate the currently assessed learning outcomes related to the specific program being reviewed. (ST)
17. The programs with courses that are assessed in the Core Curriculum assessment process should coordinate with the Core Curriculum Review by providing a method of assessing core learning outcomes beyond the CAAP standardized test. (E)
18. Individual academic program units should coordinate their curriculum with their learning outcomes by preparing a curriculum map to document which courses relate to the learning outcomes being assessed. The TracDat system provides a tool to develop a curriculum map. (LT)
19. Planning, assessment, improvement, and institutional effectiveness should be better coordinated by fully implementing the available components of the TracDat software. (ST)
20. Student recruitment and retention plans formulated by the academic departments should be incorporated into TracDat outcomes assessment. (LT)
21. Smaller academic programs in similar disciplines should consider coordinating their assessment by adopting some common outcomes that could be assessed as a larger sample. (ST)
22. Emphasize the importance of documentation in the TracDat Assessment Management System, including documentation of outcomes, measures and related criteria, and improvements based upon analysis of data. (E)
23. Set up assessment on a continuing basis throughout the year, not just at the end of the year. (ST)
24. Adopt the recommendations from the Evaluation Group arising from the Assessment Task Force Department Chair Survey conducted in Fall of 2010. (ST and LT) Details of these recommendations can be found under the Evaluation Group section of this document.
Outcomes Assessment Recommendations
Specific recommendations related to outcomes assessment can be divided according to whether the
outcome concerns academic programs or administrative units. Recommendations for academic
programs and related support programs include the following items.
8. SACS reviewers’ comments should be used to improve assessment. (E)
9. Increase process compliance to prior levels. (E)
10. Use weaknesses identified in Summer Program assessment to improve assessment year-round. (E)
11. Have more than one assessment method for each outcome so that evidence for meeting the outcomes will be strengthened. (E)
12. Limit the number of learning outcomes assessed in a three-year cycle to three to four specific (knowledge, skills, abilities) outcomes that can be assessed using multiple methods. (E)
13. Prepare a curriculum map to identify courses where outcomes can be assessed using embedded course assessment employing rubrics. (E)
14. Conduct additional training for use of TracDat, outcomes assessment, results analysis, and improvement using face-to-face and online training. (E)
Recommendations for administrative units include the following items.
9. Increase process compliance to prior levels. (E)
10. Emphasize the assessment of unit operations and processes. (E)
11. Determine the core functions of the unit that allow for reaching the goals of the office.
12. Determine how the unit interacts with other administrative and/or academic units when carrying out its core functions. Administrative units carry out their functions across multiple offices. (E)
13. Choose three core functions and state them as outcomes for evaluation for each three-year cycle of assessment. (E)
14. Include the Institutional Effectiveness Survey results as an additional outcome. (E)
15. Have more than one assessment method for each outcome so that evidence for meeting the outcomes will be strengthened. (E)
16. Conduct additional training for use of TracDat, outcomes assessment, results analysis, and improvement using face-to-face and online training. (E)
The Assessment Task Force presents these recommendations as steps in developing a process of
assessment at Xavier University of Louisiana that will ensure our university mission will be met for all
students. Additionally, adoption and successful implementation of the recommendations will
guarantee that assessment of our educational programs and administrative processes follows best
practices that encourage continuous improvement based upon careful evaluation and evidence. These
recommendations, when implemented in a systematic fashion according to the timeline presented
above, will establish a culture of assessment for continued success.
APPENDIX A
ASSESSMENT SUMMIT REPORT - XAVIER UNIVERSITY of LOUISIANA
June 15, 2010
Assessment Summit
Output Document
v.1
Leadership Strategies
56 Perimeter Center East, #103
Atlanta, Georgia 30346
770.454.1440
www.leadstrat.com
Overview
Xavier University of Louisiana held a one-day assessment summit on June 15, 2010. Cynthia Waisner from Leadership Strategies served as facilitator and documenter for the session.
Retreat Purpose and Objectives:
The purpose of the retreat was to establish a strategic framework for assessment at Xavier.
Objectives of the retreat included the following:
Reach consensus on the purpose, or mission, of assessment at Xavier;
Agree on a shared vision and long-term goals derived from that vision;
Identify critical success factors and barriers for each of the long-term goals;
Develop a list of possible strategies, or projects, for moving toward the vision, achieving critical success factors, and/or reducing or eliminating identified barriers;
Reach agreement on priority strategies from the list of possible strategies; and
Identify next steps, milestones, and accountabilities associated with the priority strategies.
Retreat Agenda
Opening Exercise
Hopes and Concerns
Where We're Going - The Big Picture
Mission
Vision
Goals
How We'll Get There
Critical Success Factors
Barriers
Strategies
Strategy Prioritization
Action Plans (High Level)
Session Close
Documentation Contents
This document presents the results of the session as recorded by the facilitator. Comments appearing in italic represent additions made by the documenter for clarity. The document includes the following sections:
I. Opening Exercise
II. Vision
III. Goals
IV. Mission Statement
V. CSFs and Barriers
VI. Strategies and Priority Strategies
VII. Action Plan
VIII. Next Steps
I. Opening Exercise
Participants were asked to identify their greatest hopes and concerns for the day. Each of the four table groups was then asked to agree upon and report back their top hope and top concern.
Hopes
That we create a culture of assessment that is useful, integrated, and coordinates all types of assessment across the University
That we are able to breach the silos, connect the various assessments and connect everything to the institution
That we develop a realistic, integrated, usable, and transparent system across all levels
That we develop a comprehensive framework for assessment, moving from the general to the more specific, and that we all have a better feeling and sense of purpose regarding assessment at the University

Concerns
That our assessment process is not going to be integrated and useful across the University
That we do not achieve an Institution-wide focus
That we develop something that does not meet the criteria listed in the corresponding hope
That we not fully take into account Xavier's learning culture: can what we agree on here truly be transplanted across the University?
II. Vision
A visioning exercise was used, followed by breakout sessions in small groups to identify common themes. These themes were used to develop the mission, vision, and goals.
VISION: The vision provides a picture of a preferred future state.
Proposed Vision Statement
Assessment: Driving Growth and Change
The following slogans were on post-it pages on the Green Team's table. They are included here for possible use as the effort goes forward:
• Assess to be your best!
• Data are your friend!
• Bloom and grow!
III. Goals
Goals identify broad aims toward which the organization will work over a long (approximately 10-year) span of time:
BUY-IN: Maximize buy-in for assessment with all stakeholders throughout their affiliation with the University
CULTURE OF ASSESSMENT: Promote a more accepting culture of assessment
You may want to omit the qualifying words "more accepting" in your final version
RESOURCES: Provide levels of resources needed to effectively carry out assessment
Green Team edited to read: Provide resources needed to carry out assessment effectively
EFFECTIVE ASSESSMENT MODEL: Promote an assessment model that is effective, systematic, integrated, collaborative, and used for growth and positive change. This model will include:
• Ensuring that data collection is accurate, timely, and consistent across the University;
• Ensuring regular evaluation of the assessment process and its usefulness to the University;
• Ensuring usefulness and use of the assessment systems;
• Maximizing availability of comparable data sets for analysis;
• Ensuring transparency of assessment needs, data, and results, as well as the responsibility for these needs, data, and results;
• Maintaining mission-oriented focus; This was originally stated as "learning-centered focus"; and
• Maintaining coordinated (centralized?) data.
IV. Mission Statement
A mission defines the overall purpose of an organization. The mission statement should describe what the organization does, for whom it does it, and the benefit.
Proposed Mission Statement
To systematically collect, analyze, and use relevant data from all levels — i.e., individual, departmental, and University — in order to continuously and effectively fulfill the mission and the overall goals of the University
Groups were asked to identify possible edits to the above. They suggested:
• Include "identify goals"
• Fix the split infinitive and shorten it
• Add concept of "improvement"
• Move reason for assessment ("fulfilling the mission") to the beginning of the statement
The other three draft mission statements are included below. Specific language and concepts contained in them may be useful in crafting the final version of the mission statement:
• To set goals and evaluate performance to ensure outcomes are met in order to drive growth and change to the benefit of the overall University community.
• The purpose of assessment at Xavier is to support the mission of the University by:
• Developing and implementing a plan for continuous cyclical improvement;
• Documenting implementation results; and
• Using the documentation to improve the plan
• The purpose of assessment at Xavier is to:
• Support the mission of student learning;
• Identify critical areas for decision-making; and
• Support decision-making, growth, and improvement through the systematic collection, organization, and analysis of information.
V. CSFs and Barriers
The participants next identified Critical Success Factors and Barriers for each goal. First, groups identified CSFs — those things which must go right, or the conditions that must be created — for the goals and objectives to succeed. Participants next identified Barriers — those conditions which might hinder success in achieving the objectives.
GOAL: BUY-IN
CRITICAL SUCCESS FACTORS
Identifying the stakeholders and their unique roles and responsibilities
Accountability
Having a truly effective model and resources to follow that model
Marketing plan
Value obvious to stakeholders
Simple and easy to use
BARRIERS
Misunderstanding of assessment
Lack of knowledge about assessment
Resistance to change
Available time
Perception and/or reality that effort is top-down
GOAL: CULTURE OF ASSESSMENT
CRITICAL SUCCESS FACTORS
Stakeholders understand assessment
Stakeholders have shared value for and language of assessment
Stakeholders see the value of assessment
Strong, consistent reinforcement of best policies
BARRIERS
Individual beliefs and mindset that past assessment results have not been used for growth and change
Belief that assessment is only needed/used for administrative purposes (e.g., SACS)
Results are not communicated consistently to stakeholders
Lack of necessary resources
Competing cultures and values
Maintenance of the status quo
GOAL: RESOURCES
CRITICAL SUCCESS FACTORS
All areas have the means for successful assessments
BARRIERS
Assessment at "any cost" There was considerable discussion around this point, with the end result being clarification that this meant that assessment does not get shunted aside by competing priorities, but remains a high priority and area of focus "no matter what," i.e., "at any cost"
Identification and evaluation of required resources
Poor planning
Competing institutional priorities
Budget process is not aligned with assessment
Quality and quantity of staff
GOAL: EFFECTIVE ASSESSMENT MODEL
CRITICAL SUCCESS FACTORS
The model is useful at all levels
Data are good and of a quality that is responsive to users and stakeholders
Standard data templates are used where appropriate
Responsible, trained personnel are assigned to each assessment
There is stakeholder knowledge regarding the data that are already available
BARRIERS
Temptation/tendency to do what is easy vs. what is meaningful
Fear of exposing weakness
Territorial attitude towards data
VI. Strategies and Priority Strategies
Participants were next asked to brainstorm possible strategies for each of the goal areas, as well as to assign each "potential strategy" generated earlier in the day to one of the goal areas. Participants were asked to generate strategies that: directly affected achievement of the goal; strengthened, leveraged, or put into place a CSF; and/or weakened, removed, or lessened the impact of a barrier.
Once strategies were generated and reviewed, participants were asked to "vote," using dots, to establish priority strategies. Each participant was given four dots per goal area to assign as he or she wished. Results for each goal area, in decreasing order of priority, are provided below.
GOAL: BUY-IN
STRATEGY (TOTAL POINTS AWARDED)
Develop an incentive program that highlights successful assessment efforts combined with Conduct a festival of assessment to report successes and issues (18)
Develop a campus-wide marketing plan for assessment (17)
Identify the stakeholders and their roles and responsibilities (16)
Conduct focus groups to solicit input and integrate their input into the model (10) At the bottom of this post-it, someone added the comment "Top down"
Develop a professional development plan for stakeholders (9)
Identify resistant stakeholders and work with them combined with Invite them to the festival and team them with assessment ambassadors (5) In the discussion, the point was made that even though this was not a top vote getter, it should be incorporated early into the plans and actions for assessment. Resistant stakeholders may have a lot of influence, and failing to address this early on could make the initiative more difficult or even sideline it altogether
Compulsory entrance and exit evaluations for students and employees (1)
Regularly survey stakeholders and develop a strategy to determine the effectiveness of the model (1)
Hold (monthly?) seminars: define outcomes, measurement, focus groups? (1)
Communicate the work of this group and the barriers we have identified to implementation and acknowledge problems in past systems (0)
GOAL: CULTURE OF ASSESSMENT
STRATEGY (TOTAL POINTS AWARDED)
Assessment training and education (22)
Develop incentive system for assessment (15) Some participants felt that this should be combined with the strategy listed first, for a combined total of 37. This strategy also appeared in the "buy-in" category — when the totals from the two categories are combined, there are 33 points for this strategy. If you combine it with "assessment training and education," the total jumps to 55.
Add an assessment link at main Xavier web page (12)
Develop a system for dissemination of results to all stakeholders annually (10)
Promote scholarship of assessment to faculty (9)
Develop video of success stories (5)
Publish vision and mission of assessment on a document that includes the goals of assessment (4)
Recognition for assessment — need both sticks and carrots (1)
Create a video that describes the new assessment system (0)
Food, entertainment, daily activities (0)
GOAL: RESOURCES
STRATEGY (TOTAL POINTS AWARDED)
All-Star Assessment Team combined with Identify personnel combined with Identify who does what with institutional assessment (25)
Identify needs: personnel, equipment, know-how, and training combined with Identify all institutional resources that support assessment (23)
Align budgetary allocations with plan (12)
Appoint a separate QEP evaluator as stated in the QEP budget (9)
Engage University Planning Council to elevate assessment as a University priority (5)
Adhere to the strategic plan combined with plan, plan, plan! (5)
Integrate grant-funded assessment into general assessment plan (3)
Attempt to identify new, external resources, especially for initial costs of developing/launching new system (1)
GOAL: EFFECTIVE ASSESSMENT MODEL
STRATEGY (TOTAL POINTS AWARDED)
Look at best practices and identify which components could work at Xavier combined with Evaluate current model — what is useful (strengths and weaknesses) (30)
Develop official, central hub of information — who is responsible combined with For each of the University's priorities, common or compatible departments across campus develop objectives, benchmarks, and a timeline. They meet regularly to discuss SWOT. (12) A note connecting the two strategies read "All Star Assessment Team." See notes in Action Plan regarding two visions for this assessment team
KPIs (Key Performance Indicators) (11)
Invest in technology that will streamline processes (10)
Develop uniform data sets with consistent variable names, based on institutional data (8)
University defines what assessment means (6)
Develop organizational plan (4)
1) Identify characteristics of overall assessment that all can agree on; 2) identify systems that need to be integrated; 3) make recommendations concerning next steps, actions, timelines; 4) identify resources needed to carry out the plan (1)
Develop a plan for coordinating efforts across the campus (who is responsible, who is accountable?) (0)
Have each unit or department input their data on a timely basis (0)
VII. Action Plan
After identifying the priorities, the group determined that the first step needed to be creation and charging of the "All-Star Assessment Team." There were two separate visions of who would be part of this group and what its primary task would be. Some saw the group as having a high degree of assessment subject matter expertise and doing the early analysis regarding who is currently doing what, what resources are available and needed, etc. The group would then make recommendations and open up the discussion to a broader, more inclusive group, eventually reporting findings and making recommendations to the Planning Council. Others saw the group as a more representative and inclusive body of those involved in assessment, meeting regularly to share issues and expertise, concerns, etc. After some discussion, the group decided that the former model should be the focus for this first group, with the understanding that the development of the more inclusive working committee will come at a later date.
This group will be a committee of SMEs in assessment. It will be led by Dr. Brookover, who will identify and invite other members. It was noted that it will be important to have representation from the non-academic side of the University. Recommendations are being developed regarding this appointment. The committee will function, in a sense, as a body of internal consultants. Its working process may include the following steps and stages:
1) map and evaluate the current system;
2) identify best practices;
3) identify strengths and weaknesses of the current system;
4) develop a preliminary set of recommendations;
5) seek input regarding these initial recommendations; and
6) report to the Planning Council.
The specific steps and working process will be defined by the committee. Next steps and responsibilities, including those for ensuring high-level support, are included in the next section.
VIII. Next Steps
The following action items and responsibilities were identified during the course of the retreat.
Action / Assigned to / Due Date
1. Complete documentation from the retreat / C. Waisner / 6/18/10
2. Wordsmith draft mission, vision, and goal statements and circulate for comments / R. Durnford / 6/22/10
3. Develop "one-pager" that outlines mission, vision, goals? (Possible follow-up item) / R. Durnford / As determined by Committee
4. Identify membership for "All-Star" committee / C. Brookover / 6/22/10
5. Hold first meeting of "All-Star" committee. Purpose: To develop charge and working plan / C. Brookover / 7/1/10
6. Prepare announcement to University community reporting on outcome of this meeting, establishment of "All-Star" committee, and indicating high-level support and commitment to this initiative / L. Blanchard and/or N. Francis / 9/1/10
APPENDIX B
Results from the Evaluation Group Chair Survey in Tables
Table 1 - Which of the following are included in your assessment plan?
[Table 1 could not be recovered from this transcript. Its columns covered items such as student learning outcomes, senior comps, national exam results, post-graduation training and job placement, accreditation requirements, department mission and goals, university mission, student progression, graduation results, department requirements, and other items; its rows marked with an X which items each responding department (Biology, Business, Chemistry, Computer Science, Education, English, History, Language, Math, Philosophy, Political Science, Physics, Sociology, and Theology) includes in its assessment plan.]
Table 2 - Which of the following is involved in developing the assessment plan for your department/division?
Columns: Designated Faculty Person, Designated Faculty Group, Administrators, Alumni, Chairperson, Staff, Students, All or Other Faculty
[The X marks for each responding department (Biology, Business, Chemistry, Computer Science, Education, English, History*, Language, Math, Philosophy, Political Science, Physics, Sociology, and Theology) could not be aligned to their columns in this transcript.]
* Former Chair
Table 3 - Who of the following is involved in data collection related to the assessment plan for your department/division?
Columns: Designated Faculty Person, Designated Faculty Group, Administrators, Alumni, Chairperson, Staff, Students, All Faculty
[The X marks for each responding department (Biology, Business, Chemistry, Computer Science, Education, English, History (marked A), Language, Math, Philosophy, Political Science, Physics, Sociology, and Theology) could not be aligned to their columns in this transcript.]
A - Whoever teaches History 4415
Table 4 - Who of the following is involved in discussing/interpreting assessment results for your department/division?
Columns: Designated Faculty Person, Designated Faculty Group, Administrators, Alumni, Chairperson, Staff, Students, All Faculty
[The X marks for each responding department (Biology, Business, Chemistry, Computer Science, Education, English, History, Language, Math, Philosophy, Political Science, Physics, Sociology, and Theology) could not be aligned to their columns in this transcript; two responses carried the footnote marker A.]
A - No one has been designated.
Table 5 - What priority does assessment take in relation to other departmental duties?

Department         Advising  Assessment  Community Service  Scholarship  Teaching  University Service  Other
Biology            5         4           2                  6            7         3
Business           6         3           2                  4            7         5
Chemistry          5         2           3                  6            7         4
Computer Science   5         3           2                  6            7         4
Education          7         6           3                  4            5         2
English            4         3           2                  5            7         6
History            5         1           2                  4            6         3
Language           7         3           4                  5            6         2
Math               6         4           2                  5            7         3
Philosophy         4         2           1                  6            7         5                   3
Political Science  5         3           1                  4            6         2                   7*
Physics            5         3           2                  6            7         4
Sociology          6         4           3                  2            7         5
Theology           5         4           2                  6            7         3
Overall Average    5.35      3.21        2.21               4.93         6.64      3.64

Blank - missing; * balancing all of the departmental duties. NOTE: Highest priority is 7, and lowest priority is 1. The rankings are for individual departments, and the overall composite is computed from all departments reporting.
APPENDIX C
Best Practices Resources
Best Practices — General
Alverno University publications
http://www.alverno.edu/for_educators/publications.html#sa
American Psychological Association. Assessment Cyberguide.
http://www.apa.org/ed/governance/bea/assessment-cyberguide-v2.pdf
This voluminous work is an excellent assessment publication.
Council of Regional Accrediting Commissions. (2003). Regional Accreditation and Student Learning: Principles for Good Practices.
This short publication gives a summary of good practices in assessment from the six regional accrediting
commissions.
Dwyer, C. A., Millet, C. M., & Payne, D. G. (June 2006). A Culture of Evidence: Postsecondary Assessment and Learning Outcomes. Princeton, NJ: Educational Testing Service.
This issue paper from ETS gives guidelines for college assessment at the institutional level and calls for
the six regional accrediting agencies to set up a national system which incorporates common measures
for student learning. A section that describes fair and psychometrically sound testing is included in the
paper.
Middle States Commission on Higher Education
http://www.msche.org/publications_view.asp?idPublicationType=5&txtPublicationType=Guidelines+for+Institutional+Improvement
http://www.msche.org/publications/examples-of-evidence-of-student-learning.pdf
These publications present a detailed, well-written account of all aspects of the assessment process.
North Carolina State University
http://www2.acs.ncsu.edu/UPA/assmt/Guide_Principles.htm
http://www2.acs.ncsu.edu/UPA/assmt/best_practice_stmt.htm
http://www.ncsu.edu/uap/academic-standards/uapr/process/language.html
Pennsylvania State University (Penn State). Innovation Insights series.
46
This series of articles on assessment is one of the best overall resources for assessment best practices. Links to some of these articles are shown below.
http://www.psu.edu/president/pia/innovation/creating%20a%20culture.pdf
http://www.psu.edu/president/pia/innovation/relationship%20between%20continuous%20improvement.pdf
http://www.psu.edu/president/pia/innovation/benchmarking%20for%20innovation.pdf
http://www.psu.edu/president/pia/innovation/facilitating%20teams.pdf
http://www.psu.edu/president/pia/innovation/Using%20surveys%20for%20data%20collection%20in%20continuous%20improvement.pdf
http://www.psu.edu/president/pia/innovation/improve7.pdf
http://www.psu.edu/president/pia/innovation/integrating%20planning.pdf
http://www.psu.edu/president/pia/innovation/strategic%20indicators.pdf
http://www.psu.edu/president/pia/innovation/tools%20for%20organizational%20improvement.pdf
http://www.psu.edu/president/pia/innovation/Leading%20for%20Continuous%20Improvement%20v2.pdf
Stasson, M., Doherty, K., & Poe, M. Principles of Good Practice for Assessing Student Learning. Office of Academic Planning and Assessment, University of Massachusetts — Amherst.
This 62-page handbook contains extremely useful information about the assessment process.
Shulman, L. S. (January/February 2007). Counting and Recounting: Assessment and the Quest for Accountability. Change.
This article discusses the seven pillars of assessment for accountability, which include multiple measures
and embedding assessment into courses, among others.
University of Central Florida. The Administrative Unit Handbook: Measuring Student Support Services and Administrative Outcomes.
This 41-page handbook specifically addresses assessment of administrative units. It provides good information on the entire assessment process.
http://oeas.ucf.edu/doc/adm_assess_handbook.pdf
University of Texas — Arlington. (Spring 2010). Unit Assessment Handbook.
This 91-page handbook covers every aspect of assessment and includes many examples for both academic and administrative assessment.
47
North Carolina State University. Internet Resources for Higher Education Outcomes Assessment.
http://www2.acs.ncsu.edu/UPA/assmt/resource.htm
This website contains the most comprehensive list of and links to college-level outcomes assessment resources available. It was last updated on February 7, 2011.
Culture of Assessment
Lakos, A. & Phipps, S. (2004). Creating a Culture of Assessment: A Catalyst for Organizational Change. Portal: Libraries and the Academy, 4(3), 345-361. Baltimore, MD: Johns Hopkins University Press.
This article discusses the history of a culture of assessment in terms of organizational change and specifically applies it to university libraries.
Piascik, P. & Bird, E. (2008). Evaluation, Assessment, and Outcomes in Pharmacy Education: The 2007 AACP Institute — Creating and Sustaining a Culture of Assessment. American Journal of Pharmaceutical Education, 72(5), 1-9.
This article describes the development and implementation of an assessment program at the College of Pharmacy at the University of Kentucky. Successes and challenges are discussed.
Roberts, Anderson, Bird, and Cain. Creating a Culture of Assessment: Closing the Loop. University of Kentucky, School of Pharmacy, PDF of a PowerPoint presentation.
This PowerPoint presentation describes the steps in creating a culture of assessment at the UK School of Pharmacy. It emphasizes the collaboration between faculty, students, and other stakeholders in the assessment process.
Suggested Readings on Encouraging Faculty Engagement in Assessment — PDF chart
This chart provides an extensive bibliography on obtaining faculty buy-in to the assessment process.
Measurement and Evidence
Examples of Evidence of Student Learning — PDF chart, Middle States Commission on Higher Education.
This handy chart gives numerous examples of direct and indirect evidence of student learning and of learning processes that promote student learning.
Suskie, L. (2006). The Role of Published Tests and Assessments in Higher Education — PDF chart, Middle States Commission on Higher Education.
This short article compares the three standardized measures in higher education: the ETS Measure of Academic Proficiency and Progress (MAPP), the ACT Collegiate Assessment of Academic Proficiency (CAAP), and the Council for Aid to Education Collegiate Learning Assessment (CLA).
48
Using Assessment for Improvement
Pennsylvania State University Office of Planning and Institutional Assessment. (2005). Assessing for Improvement. Innovation Insights, 11.
http://www.psu.edu/president/pia/innovation/Assessing%20for%20Improvement%202.pdf
This article from PSU's Innovation Insights series explains the reciprocal relationship between assessment and quality management.
APPENDIX D
E-mail Survey for the Mapping Group
Xavier University of Louisiana
Assessment Task Force
Xavier University has undertaken an important exercise to map activities related to assessment that are occurring throughout this institution. The goal is to gain an understanding of where assessment is taking place and who the responsible individuals are for completing these activities.
You have been identified as being a PI on a funded project. Please assist us with our efforts by completing this brief survey related to the assessment of your funded project.
Project Title:
Funding Agency:
Department / Division:
PI:
Individual(s) Responsible for Assessment / Evaluation:
Affiliation:
Is an external evaluator required for this project by the funder? [ ] YES [ ] NO
How frequently are assessment / evaluation reports required to be produced?
If reports are required, during which month are they due?
What is required in your reporting?
Description of project activities / implementation [ ] YES [ ] NO
Description of project participants / subjects (human) [ ] YES [ ] NO
Evaluation of outcomes / results [ ] YES [ ] NO
Fiscal reporting [ ] YES [ ] NO
Thank you for your assistance. If available, please provide us with an electronic copy of your required
assessment report. Again, thank you.
APPENDIX E
Mapping Group Survey
[The survey-results table in this appendix could not be recovered from this transcript. Its columns were: Funding Agency; Campus Department/Division; PI; Individual(s) Responsible for Evaluation; Is an external evaluator?; How frequently are assessment/evaluation reports required to be produced?; Description of project activities/implementation; Description of project participants/subjects (human); Evaluation of outcomes/results; and Fiscal Reporting. Its rows listed funded projects supported by agencies including NIH (SCORE awards and subcontracts through the Tulane Cancer Center), NSF, the U.S. Departments of Education and Agriculture, the Army Research Office, the Office of Naval Research, ATSDR/AMHPS, HRSA, NOAA, NASA, USEPA (through a subaward), the Louisiana Board of Regents, the Louisiana Space Consortium, LCRC, and the Wallace Foundation (through the Louisiana Dept. of Ed.).]