
This chapter discusses the responses of academic departments in a private research university to the institution's program review process and recommendations.

Program Review in Academic Departments

Lisa A. Mets

Program review is pervasive in American institutions of higher education. Barak (1982) found that 82 percent of responding colleges and universities and most higher education boards have some form of program review. They are motivated by at least one of four objectives: to improve programs; to aid selection, certification, or accountability; to increase awareness or to sell a program's importance; or to exercise authority (Barak and Breier, 1990).

There is a rich literature on program review. The literature provides sage advice on how to design and implement an effective program review. However, except for the work cited in this volume, there is scant literature on the use of the information collected through program review and the implementation of findings or recommendations. Beginning in the late 1970s, numerous conference presentations, articles, and books have addressed various aspects of program review. Although by their years of publication it may seem that these materials are dated, published monographs that continue to be particularly helpful include the following: frameworks for program evaluation (Conrad and Wilson, 1985; Feasley, 1980; and Gardner, 1977), suggestions for how to design and conduct a program review (Barak, 1982; Barak and Breier, 1990; Conrad and Wilson, 1985; Cranton and Legge, 1978; Craven, 1980; and Wilson, 1982), and research reports of institutional experiences with program review (Barak, 1982; Breier, 1986). The link between program review, planning, and change has been explored by few (Arm and Poland, 1980; Barak, 1986; and Kells, 1980).

Today, new techniques have been adopted in higher education administration and management. Most recently, discussions of institutional experiences implementing total quality management and continuous quality improvement methods have taken center stage at annual meetings of the major higher education associations. Nevertheless, those meetings indicate that there is still a great deal of interest in program review. Increasingly, one finds that institutional researchers are approaching program review from a new perspective: it is becoming an important mechanism for quality improvement and institutional change. This chapter discusses some of the results of a systematic research project at a private midwestern research university that uses program review to stimulate the improvement of academic and administrative departments.

NEW DIRECTIONS FOR INSTITUTIONAL RESEARCH, no. 86 (Using Academic Program Review), Summer 1995. © Jossey-Bass Publishers.

Program Review Process

The program review process at the institution under study has three unique aspects. The first relates to changes in leadership in key administrative positions. A new president came to the institution in 1985. This change was followed by changes in other key administrative positions (provost, vice president for research and dean of the graduate school, vice president for administration and planning, and deans and associate deans). In central administrative positions, only the provost was an internal candidate. All other central administrative positions were filled by external candidates. The deans' and associate deans' positions were filled by internal candidates.

The second relates to the impetus for program review. According to oral reports, members of the faculty's governing bodies expressed to the president a need to change the way the institution was managed. For a number of years, the institution had been experiencing financial difficulty, and the faculty saw the administration making budget decisions without their input. For several years, the policy advisory committee (PAC) of the faculty had recommended a procedure of academic and administrative program review at the institution in order to collect faculty input, but the former administration did not act on the recommendation. At the urging of the PAC, the new president and his administration agreed to introduce a comprehensive program review process. Members of the faculty worked with administrators to draft the program review procedures.

The third unique aspect relates to the purpose of program review. In this institution, program review was designed to stimulate program planning and improvement. The institution considered itself to be in a steady state in terms of the sizes of its enrollment and faculty. This meant that the institution would not expand its overall enrollment, and any increases in the size of the faculty in one department would be offset by decreases in another department. Program review was not being undertaken for the purposes of retrenchment or downsizing; it was to provide information to help the institution set priorities, make decisions regarding the reallocation of resources, and improve the quality of its programs and services. In other words, the institution had ambitions to transform itself, and the faculty and administration were willing to commit to the program review process as a mechanism to facilitate this transformation. The following paragraphs list the key elements of the program review as it was designed for the first review cycle (Office of the Vice President for Administration and Planning, 1989).


Program Review Council. A program review council is appointed yearly. Faculty and administrators are invited to serve on the council. The number of members each year equals the number of units scheduled for review. Members have rotating terms. Each member chairs a subcommittee appointed to review individual units.

Departmental Self-study. In the spring of the year before its review, the department is notified that it will be reviewed and that it should prepare a self-study. In this self-study, the department identifies its strengths and weaknesses and proposes a plan for improvement. The self-study is due the following January.

External Reviewers. The department proposes names of external reviewers, who will be brought to campus for a two-day visit in the spring. The administration reviews the names and extends invitations to bring two (sometimes three) reviewers. The reviewers write evaluative reports with recommendations, and these are combined into one report to be shared with the unit under review and with key administrators.

Program Review Council Subcommittee. A subcommittee, chaired by a program review council member, is appointed to review each individual unit. A subcommittee usually consists of three members: two members from units unrelated to the unit under review and one member from a related unit. The subcommittee reviews the self-study and the reports of the external reviewers; meets with members of the department and related units, key administrators, and the external reviewers; and writes a report incorporating its findings from these sources. The report becomes the basis on which all subsequent program review decisions are made.

Administrative Meetings. A series of meetings is held following the completion of the subcommittee's report and the review of that report by the program review council:

The subcommittee meets with the president and other key administrative officers to present its findings and recommendations regarding the unit reviewed. Based on this report, agreements and commitments are developed regarding recommendations.

The president and key administrative officers meet with the dean or vice president in charge of the unit.

The key administrative officers (without the president) meet with the dean or vice president and unit head.

The dean or vice president and unit head meet with the department. In each of these meetings, an understanding of the agreements and commitments regarding the implementation of program review recommendations is articulated.

Annual Updates. Each year, deans and vice presidents are asked to share with the vice president for administration and planning unit progress in implementing program review recommendations. This progress is then communicated to the program review council and the university trustees.


Two-Year and Four-Year Follow-Up Reports. Units biennially submit follow-up reports to central administration regarding their progress in implementing program review recommendations. In these reports, units identify the outcomes of their program review recommendations and any difficulties they have implementing recommendations. Central administration reviews the progress of the units based on these reports and re-evaluates priorities and commitments.

The institution is now in its second cycle of program review, and the process has been streamlined for the unit. To prepare its self-study, the unit highlights accomplishments from its first review, identifies issues from the first review that are still relevant in the second review, and introduces new issues that have emerged after the first review. These issues form the basis of the review. This recognizes that program review in the second cycle is not a program review de novo; program review is a continuous process that is part of an ongoing planning process.


Background

The first units were reviewed in 1985-86 (Year 1). The first cycle of review of all administrative and academic units was completed in 1990-91 (Year 6). The university's board of trustees underwent program review during 1991-92, while the university reviewed its experience during the first cycle and established revised procedures for the second cycle.

During the first program review cycle, it appeared that units differed in their approaches to the program review process and in their success in implementing program review recommendations. In other words, units appeared to behave differently both during and after their reviews. Some units appeared to successfully implement more recommendations, whereas other units appeared less successful at implementation. These differences raise the following question addressed in this study: What factors influence how departments implement program review recommendations?

Methodology

This research was conducted as a case study. A more complete description of the methodology appears in Mets (in press). The thirty-seven departments identified for the study are from the College of Arts and Sciences (N = 24), School of Speech (N = 5), and School of Engineering (N = 8). (Fifty-four people were interviewed, representing thirty-six of the thirty-seven invited departments. The unrepresented department was dropped from this study, reducing the number to thirty-six departments.)

Multiple sources of data were collected. They include focused interviews with the central administrative staff (N = 2, or 100 percent of those invited to participate), focused interviews with the deans and appropriate associate deans of the schools/colleges (N = 6, or 100 percent of those invited to participate), focused interviews with department chairs who served during a department's year of review (when available) and all subsequent department chairs appointed up to 1992-93 (N = 54, or 93 percent of those invited to participate), and analysis of documents. These documents included central administration documents, program review reports, department self-studies, department plans, correspondence, and minutes of meetings regarding program review.

Two interview protocols were designed: one for interviews with administrators and one for interviews with department chairs. Interview protocols, comprising mainly open-ended questions, guided interview sessions to ensure systematic coverage of topics. Interviewees were encouraged to express their opinions and judgments and to digress if they felt the issues were relevant. The interview guide for administrators elicited their perceptions of strengths and weaknesses of the program review process, the array of departmental problems identified through program review, strategies departments developed to implement recommendations, differences among departments regarding program review, the effect of program review on their relationship with other administrators, outcomes of program review, the success of program review in stimulating planning and program improvement, and suggestions for changing the program review process to facilitate departmental implementation of recommendations. The interview guide for department chairs focused on the department's response to program review and the resulting recommendations; strategies the department developed to implement recommendations (what worked or did not work, what facilitates implementation, and what obstacles exist); explanations why particular recommendations have not been implemented; changes in the department thought to result from program review recommendations; the impact of program review on departmental planning, communication, and governance; other outcomes of the program review process and recommendations; and suggestions to improve a department's ability to implement program review recommendations.

Systematic content analyses of the documents and interview data have been performed. This chapter presents findings from the data that are related specifically to program review and the departments' responses to their recommendations.

Responses to Program Review

The majority of those interviewed agreed that program review has been good for the administration. A new president joined the institution in 1985, followed by new appointments in the provost's office, the office of the vice president for administration and planning, and the office of the vice president for research and dean of the graduate school. Transitions in leadership occurred at the decanal level in all three schools/colleges within the first three years of program review. Department chairs repeatedly pointed out that program review is an excellent mechanism for providing a new administration with unbiased information on the status of each department in the institution. Department chairs felt that administrators are reluctant to accept their word regarding their status and needs; however, program review portrays a credible image of the department and legitimates its needs. Many strong units (those with unquestionably high national visibility and others emerging into the upper ranks) saw program review as an opportunity to prove to the administration just how good they are, and this lays a positive foundation for future interaction (particularly in budgetary matters).

There was less consensus among the department chairs on the value of program review within a department. Most department chairs remarked that they and their faculty spent enormous amounts of time and energy preparing their self-studies and visiting with internal and external reviewers. Many indicated that the self-study phase is the most valuable. They use this opportunity to seriously reconsider their mission, identify their strengths and weaknesses, examine their niche, and identify goals that would result in a stronger department. However, reflecting on what tangible benefits result from program review (such as increased resources, particularly faculty, support staff, operating budget, and space), several department chairs indicated they were disheartened after the process: they felt they had expended too much effort in relationship to the little they perceive they received in return. These few department chairs predicted that their departments would participate less enthusiastically in the second cycle of program review.

Responsibility for Implementing Recommendations

A total of 296 recommendations were made for the 36 departments included in this study. They ranged from four recommendations for one department to sixteen for another. The average number of recommendations per department was eight.

It is important to categorize the recommendations according to the party responsible for executing the recommendation. Language such as "The department should" or "The administration should" clearly assigns responsibility. However, many recommendations were written in the passive voice or the imperative and left the assignment of responsibility unspecified. Although in most instances responsibility for recommendation implementation could be inferred, it was not self-evident in all instances. Table 2.1 shows the distribution of the recommendations among the three categories.

Table 2.1. Assignment of Responsibility to Implement Program Review Recommendations

Assignment of Responsibility    Number of Recommendations (N = 296)    Percentage of Total
Administration                  72                                      24%
Unspecified                     93                                      31
Department                      131                                     44

Of all recommendations, 24 percent were addressed specifically to the administration, and nearly twice that number (44 percent of all recommendations) were addressed to the department. However, 31 percent of all recommendations did not explicitly assign implementation to either the administration or the department.

The importance of this distinction emerges through the interviews with department chairs. As they were asked to respond to each individual recommendation for their department and to indicate whether they had implemented each recommendation, department chairs pointed out that they could not act on a number of the recommendations because it was unclear who had responsibility for implementation or implementation was clearly assigned to the administration. At this point in the interviews, many chairs indicated that this exacerbated negative feelings they may have had for the program review process. They felt that they were being asked to be accountable for the recommendations that were clearly their responsibility, but that the administration was not asked to be accountable for implementing recommendations that were clearly its responsibility. In most instances, recommendations addressed to the administration suggested that additional resources be given to the departments.

Departmental Responses to Program Review Recommendations

Table 2.2 illustrates how the recommendations were distributed when categorized according to their content. Eleven broad categories of recommendations emerged from the analysis of the 296 recommendations. Within each category, subcategories were developed to aggregate recommendations addressing similar issues. A discussion of examples of recommendations that fall into each category, and departmental responses to those recommendations, follows Table 2.2.

Table 2.2. Categories of Program Review Recommendations

Issues Addressed by Program Review Recommendations    Number    Percentage of All Recommendations (N = 296)
Faculty                                               96        32%
Graduate program                                      42        14
Departmental support and support staff                30        10
Plans and planning                                    25        8
Space and equipment                                   24        8
Undergraduate program                                 24        8
Departmental mission and research focus               16        5
Collaboration with other units                        11        4
Other                                                 11        4
Departmental governance                               9         3
Chair/leadership                                      8         3

Note: Total may not equal 100 percent because of rounding.


Faculty. Nearly one-third of all recommendations addressed issues regarding the departments' faculty. Five subcategories made up this category. Recommendations were made to: increase, maintain, or reduce the size of the faculty; increase salaries; adjust workloads; explore joint or courtesy appointments; and mentor junior faculty. Generally speaking, the recommendations regarding faculty reflected a common strategy to build on the departments' existing strengths or to refocus their areas of specialization.

As one might anticipate, department chairs indicated that their departments look favorably on recommendations regarding faculty hiring, workload reduction, and increased salaries. However, the implementation of these recommendations requires additional resources from outside the department, and some department chairs expected an immediate response from the administration regarding additional support. The administration responded to these recommendations, but not as quickly as some department chairs hoped. Regarding the recommendations to increase salaries, however, soon after the first cycle of program review was initiated, central administration introduced an aggressive strategy to increase faculty salaries. Many department chairs expressed satisfaction (from low to high) with the administration's response.

Regarding faculty hiring, department chairs related mixed experiences. Some indicated that they understand that financial pressures on the schools and college require the deans to prioritize hiring decisions and that not all recommendations for faculty hiring could be implemented immediately. Nevertheless, many department chairs explore creative ways to create new positions in their departments. For example, they use bridge appointments to fill the positions of faculty anticipating retirement, and they use joint or courtesy appointments to build faculty strength. A few chairs expressed disappointment at administrative failures to provide resources to allow departments to recruit additional faculty.

Graduate Program. Fourteen percent of the recommendations addressed the departments' graduate programs. Four subcategories were created for these recommendations: revise the curriculum, strengthen the quality of entering students, increase support (primarily by increasing the number of teaching assistantships and fellowships), and improve advising and monitoring of progress toward the degree.

When department chairs described their departments' responses to recommendations about their graduate programs and what they did to implement those recommendations, most indicated that the program review recommendations did not bring up problems in their graduate programs that they were not already aware of themselves. However, in a few cases, department chairs indicated that until they had read the program review council subcommittee's final report, they were not aware of the severity of a particular problem.

Nevertheless, the department chairs indicated that they move quickly to address recommendations to improve their graduate curricula, the quality of their entering students, and advising. However, they implement improvements without the creation of new committees. In most cases, department chairs work through existing structures in their departments, such as a graduate affairs committee, or with the director of graduate affairs. Departments review and revise their curricular offerings, write graduate student handbooks that address expected progress and time to degree, develop aggressive recruiting programs, design more attractive brochures, encourage faculty to personally contact prospective students, and establish accounts (with support from their schools and college) to provide funds for campus visits by prospective candidates.

Recommendations for increases in financial support for graduate students can be addressed only by the central administration of the graduate school. With regard to financial support for graduate students, the department chairs at this institution probably agree with most other department chairs across the country: there is never enough financial support from the administration for graduate students. That notwithstanding, many department chairs are persistent and find ways to manage the allocation of resources for graduate students. For example, adjustments have been made in multiyear funding arrangements, recruiting strategies (when offers of financial support can be made and how many may be extended simultaneously), and the numbers of assistantships and fellowships available.

Departmental Support and Support Staff. Only 10 percent of all recommendations addressed departmental support and support staff. Three subcategories emerged: increase the department's operating budget, increase support staff, and increase support for colloquia, seminars, and other intellectual programs. Among these three categories, nearly half of the recommendations suggested that the administration provide increased operating support for departments and half suggested that the departments themselves should be more aggressive in their efforts to raise funds.

Many of the department chairs agreed that they need to pursue outside funding more aggressively, but only a few departments initiated any such activities. These few departments may be categorized as entrepreneurial. They are skilled at identifying external funding sources and at raising matching funds (even within the institution).

It may be surprising to learn that the administration in many reviews quickly responded to the needs of departments by providing additional funds to their operating budgets for colloquia, seminars, or support personnel. Given the relatively small numbers of recommendations in this category, one might appropriately conclude that the program reviews provided compelling evidence of the need for additional resources in these subcategories.

Plans and Planning. Eight percent of all recommendations addressed plans and planning within departments. Two subcategories emerged. Recommendations in the first subcategory directed units to develop plans. Generally, these recommendations were addressed to departments that had not articulated clear missions, directions, or focus in their self-studies. The department chairs in these units responded quickly and positively to develop plans. In some instances, they were motivated to do so because future faculty hiring and the continuation of their graduate programs are contingent on the development of departmental plans.


The recommendations in the second subcategory directed the development of plans for initiatives across units. The revision of the undergraduate biological and life sciences curriculum illustrates the implementation of this recommendation. Examples of additional plans cutting across units include initiatives among engineering departments, engineering and business departments, language departments, religious studies, and communication departments. Two university-wide initiatives (in artificial intelligence and environmental engineering) also required the development of plans across units. Although these examples show that these recommendations can be successfully implemented, department chairs indicated that implementation can be difficult. Encouragement and participation by the school administration and central university administration facilitate implementation.

Undergraduate Program. Eight percent of all recommendations were related to the departments' undergraduate programs. This is slightly more than half the percentage of recommendations for the departments' graduate programs. At this institution, the subcommittee reports reflected more variance among the perceived quality of the institution's graduate programs than among the undergraduate programs. Generally, the undergraduate programs were viewed to be strong. Three subcategories of recommendations were identified: revise the curriculum, strengthen the major, and improve advising.

Department chairs' responses echoed the overall assessment that the program review process did not reveal any problems that they were not already addressing. Department chairs responded that existing undergraduate committees, directors of undergraduate studies, or small departments as a whole were already reviewing and revising their curriculum, developing strategies to strengthen the major (for example, instituting a capstone course), and improving advising.

A noteworthy outcome of the program review illustrates the impact it can have not just within a single unit but across units. As a result of the reviews of the units in which the life sciences are taught, a cross-departmental committee was created to revise the entire undergraduate curriculum in the biological and life sciences.

Space and Equipment. Eight percent of all recommendations addressed departmental space and equipment needs. Four subcategories were created: increase space; improve facilities, labs, and library holdings; provide new equipment; and improve the computing and technological environment.

Clearly, these kinds of recommendations require the allocation of resources from the administration. Department chairs cannot implement these recommendations with existing departmental resources. The physical facilities of the institution had declined over the years under its deferred maintenance program. After the new president arrived at the institution, he instituted an aggressive strategy to rehabilitate the institution, and many of the department chairs recognized improvements. One issue that several departments across all disciplines raised in their program reviews was the need for contiguous space. Some of the departments have seen this issue resolved to their satisfaction, but the issue continues for others. A great deal of progress has been made in the computing environment, both for the institution and for individual departments. The president established a matching computing fund for all new tenure-track faculty so that their departments would have startup funds available for computers, and funds were allocated to departments for more personal computers and networked computing stations.

Departmental Mission and Research Focus. Five percent of all recommendations addressed the departments’ mission and research focus. Two subcategories emerged: redefine or refine the departmental mission and focus the department’s research emphases. Generally, program review council subcommittees made these recommendations for departments that were attempting to cover too many research areas and could be strengthened if they refined their focus.

Department chairs responded favorably to these recommendations. They agreed that their departments needed to reduce their breadth of coverage and to build on their obvious strengths. Implementation of these recommendations is reflected in curriculum revisions, reconfiguration of research emphases, and development of hiring plans targeting new faculty with specific areas of expertise.

Collaboration with Other Units. Four percent of all recommendations suggested that departments collaborate with other units: they should increase collaboration to build critical mass and increase collaboration with units sharing similar interests. The subcommittee reports suggested that collaboration with existing units on campus could strengthen a department by adding critical mass to small departments or adding breadth or depth to their academic and research programs. These recommendations are particularly important in an institution that considers itself to be in a steady state. They support an institutional strategy to build on existing strengths and resources.

The department chairs remarked that their departments generally respond positively to such recommendations, but implementation is problematic. In many cases, collaboration means the establishment of joint faculty appointments or cross-listing of courses. It is difficult for department chairs to initiate and establish collaborative agreements across departmental, college or school, and campus boundaries. Salary and teaching workload differentials exist across units. Some department chairs indicated that strong units are reluctant to collaborate with what appear to be weaker units. Unless there is administrative leadership at the school, college, or central level, these recommendations are not an immediate priority for the departments.

Departmental Governance. Only 3 percent of all recommendations related to departmental governance, or how decisions are made within departments. One subcategory of recommendations suggested that new or additional structures be created within departments. For example, departments should add a director of graduate studies or a director of undergraduate studies where these were absent. A department could assign someone to be responsible for developing a seminar or colloquium program.


A second category of recommendations suggested that the department change its style of governance. In some reviews, subcommittee members remarked that departments appear to be too democratic. These are departments that encourage faculty debate and voting on all departmental matters, which most would agree reflects a positive participatory style of governance. However, subcommittee members believed that this places an undue burden on each faculty member to engage in weekly departmental meetings to debate all issues on all departmental matters. Strong departments resisted recommendations that their mode of governance be changed. In other reviews, subcommittee members identified factions within departments that cause the departments to appear to be dysfunctional. Department chairs who agreed that the subcommittee members identified a real problem within their departments moved quickly to address these concerns when possible. However, factions within departments may exist because of disciplinary differences, methodological differences, or personality differences. There is little department chairs can do if dissension within the department is due to personality clashes.

Department Chair and Leadership. Only 3 percent of the recommendations addressed a department’s chairmanship and leadership. The recommendations suggested that the department strengthen the role of the chair or that new leadership be recruited for the chair’s position. In many departments, these recommendations to strengthen the power of existing chairs or to hire eminent scholars from the outside to lead the department were not implemented. Cultural resistance within the department to changing the role of the chair partially explains the failure to implement this type of recommendation. Instead of hiring new chairs from the outside, internal candidates are selected; sometimes this strategy is chosen by the department, other times by the administration.

The amount of turnover that occurred in department chairs over the six-year program review cycle may appear surprising. For thirty-seven departments, sixty-six people served as chairs for the period under study (those who served during the review and those who were appointed in succession up to the 1992–93 academic year). In many departments, turnover in chairs was natural because of the expiration of terms or because the department chair wanted to step down. In some other departments, turnover was created because the program review indicated that there was a need for a change of leadership in the department. Partially due to the information acquired through program review, the administration established term limits for department chairs and developed an annual orientation program for department chairs. Turnover in department chairs has a direct impact on program review and the implementation of program review recommendations. One cannot assume that succeeding chairs have the same knowledge of program review agreements and follow-up activities that may have been articulated with their predecessors.

Other. Four percent of the recommendations touched on other matters, but there was no common theme among these recommendations. In other words, these recommendations were one-of-a-kind and directed toward individual units. Citing them here would violate a promise to neither reiterate specific recommendations nor identify individual departments in this study.

Better Off or Worse Off Because of Program Review?

Toward the end of the interview, each chair was asked, “Is your department better off or worse off today because of program review?” Nearly two-thirds of the department chairs responded that their departments are better today than they were at the time of their review, and nearly three-fourths of those department chairs agreed that program review has contributed in some way to their improvement. Department chairs remarked that program review serves two important functions. First, it provides them an important opportunity to establish credibility with the administration. Consequently, better decisions can be made regarding institutional priorities and the allocation of resources. Second, it helps communicate the quality of their departments outside the institution and across the country through the visits of the external evaluators.

A little more than one-third of the department chairs felt that their departments are not better off because of program review. Of those department chairs, one-third felt they are worse off. In very few instances, department chairs said that because they have not changed since program review, they do not think they are better off. This response was generally offered by nationally top-ranked departments, and it raises one issue regarding program review at this institution. Program review is viewed as a mechanism for change or as a tool to improve the quality of a department. This implies that the status quo is no longer acceptable. Every department wants to improve, and this includes the top-ranked departments.

Also among the group of department chairs who felt that their departments are not better off because of program review, two-thirds felt that their departments are no different following program review. Among these, many indicated that the recommendations made are for activities that they are engaged in independent of program review. The program review recommendations in most instances told them nothing that they did not already know. In other words, they saw little or no value added to the quality of their department after having engaged in the process. Others indicated that although their departments received additional resources following program review, they could not attribute their allocation to the program review process. This response represents a second issue in program review. The units at this institution are explicitly told that they should not expect additional resources following their reviews. Although many department chairs agreed they heard this message, they nevertheless raised their hopes. Consequently, they were disappointed when they did not receive resources for what they felt were compelling needs. The administration, in turn, was reluctant to articulate that certain resources were allocated directly as a result of program review. In other words, the administration hesitated to say, “As a result of your program review, you will be allocated two additional faculty positions” or “You will receive an increase in your operating budget.” Some exceptions were made. Generally, however, the administration referred back to the results of program review and incorporated program review findings in resource allocation decisions made through the institution’s normal budget processes. As a result, department chairs were uncertain how or when administrators were responding to program review recommendations. Department chairs seldom expressed the view that they knew that the administration had made an allocation to them based on their program review.

The design of this program review process itself contributes to this problem. Each unit’s program review is treated confidentially. That is, only the members of the unit, the program review council members, and appropriate administrators see the reports of the units. Therefore, units do not officially know the findings of the program reviews of other units. Likewise, they do not know officially what administrative decisions have been made based on the reviews of other units.

Suggestions to Facilitate Implementation of Recommendations

According to the administrators and department chairs, program review requires a lot of time and energy. Therefore, it is essential that departments perceive that the value of the process surpasses their investment of money, time, and energy.

Department chairs were asked, “What would you do to improve the program review process to facilitate the implementation of program review recommendations?” Most department chairs responded that the program review process itself is good and requires little or no change. They especially value the visits by the external reviewers. They find the self-study process worthwhile, but taxing on the department. Remarks regarding the subcommittee reports were generally positive, but there were concerns in isolated cases about the quality of the subcommittees’ performance and reports.

Although the department chairs indicated they are generally satisfied with the program review process, most remarked that attention must be paid to the period after program review, when departments and the administration begin implementing recommendations. The following summarizes problems departments face when implementing program review recommendations and recommendations to address them.

Program Review’s Half-Life. Schmidtlein and Milton (1988–89) point out that plans have a shelf life of one to three years. Likewise, several department chairs suggested that program review has a half-life of one to two years. Departments may act on their recommendations the year immediately following their review, but they seem less able to sustain attention to the recommendations over time. This problem is exacerbated by turnover in department chairs. Sometimes, program review may be forgotten in the transition between chairs. Some of the succeeding chairs indicated that they reviewed their program review reports when they assumed their positions, and they suggested that other chairs might do the same.

Continuing Administrative Attention. Lack of continuing administrative attention diminishes a department’s momentum to implement recommendations. If the dean and department chair do not maintain an ongoing dialogue, implementation may fall to the side. Some department chairs suggested that their school and college administrators meet with them periodically to discuss progress regarding implementation.

Administrative Accountability. Departments were frustrated by perceived administrative failure to implement program review recommendations. They felt they were asked to be accountable for implementing recommendations that were assigned to them, but that the administration did not have to account for recommendations assigned to it. Many of the department chairs suggested that implementation of recommendations would be improved if explicit (and perhaps written) agreements were made at the end of the process about who would do what to implement which recommendations (this is now done). This means that the administration would be required to articulate which recommendations it would actively work with the department to implement and what resources it was willing to make available to the department if additional resources were needed. Likewise, when two-year and four-year follow-up reports are due, some department chairs suggested that the administration should indicate what it has done, what it has not done, and what it is willing to do for the department to help it implement its program review recommendations.

Link Between Program Review and Its Outcomes. Many department chairs responded that their departments are better off after program review. However, many indicated that they do not perceive a direct link between program review and the improved quality of their unit. What indicators do department chairs cite to demonstrate improved quality? Department chairs indicated that they are recruiting higher-quality graduate students (with measures in addition to standardized test scores or undergraduate experiences), they are successfully recruiting their first choices in new faculty hires, higher-ranked departments are attempting to recruit away their faculty, they are competing more successfully for externally funded research resources, and their facilities are improved. Understandably, it is difficult to see a direct link between program review and these measures of improved quality.

As mentioned earlier, a major challenge of program review is to contain a department’s expectations for additional resources. Nevertheless, department chairs indicated that they would like the administration to communicate to them which administrative decisions have been influenced by program review. This would help departments better weigh the investment of their time and energy in the program review process against the benefits returned. It may also help departments recognize the link between program review outcomes and quality improvement.

This may lead one to construct the following view of program review: information is the newly minted coin of the realm. Program review is a minting process. Departments allocate coins (information) to the administration through the process. The administration needs to show what the departments’ coins have purchased. What those coins have purchased has to be valuable. At this institution, the department chairs indicate that they need more evidence of what their coins have purchased and that it is valuable.

However, this view is too simplistic. It presumes that there is a direct cause-and-effect relationship between program review and resource allocation decisions. This does not accurately depict how decisions are made in colleges and universities. It is true that program review creates a base of information for decision making. This information may be considered valuable because it is vetted by faculty both internal and external to the institution and because it speaks to the quality of the institution’s programs and services. However, this information is then fed into the institution’s internal management processes, such as those addressing hiring plans, budget planning, capital planning, and development efforts. In sum, it is unrealistic to expect information from program review to “buy” faculty and staff positions, physical space, equipment, or other departmental resources.

Link Between Program Review and Planning. The department chairs did not see a missing link between program review and planning, but this is a concern I would raise. Although program review is intended to promote departmental planning as the key to stimulating improvement, surprisingly few departments indicated that program review has an impact on departmental planning. When asked how program review affected planning in their departments, department chairs responded in three ways. Some department chairs indicated that program review makes their departments think about where they were, where they wanted to go, and what they needed to do to get there. Other chairs responded that program review does not change the way they plan and, furthermore, they are always planning. They engage in planning when beginning searches for new faculty hires or projecting curricular offerings and teaching schedules. The remaining few department chairs asked me what I meant by planning. The responses by department chairs in these latter two categories indicate that there is sufficient need for the administration to establish a program that teaches formal planning.

How Does This Relate to Planning and Institutional Research Functions?

Many of the findings from this study may resonate with the experiences of planners and institutional researchers who manage or participate in program review. Many of the issues identified in this study have been raised on other campuses. However, if an institution is launching program review for the first time, the findings reported here may aid decision making about how to design and conduct a program review process; they may also help the institution avoid some of the issues identified by the department chairs engaged in the process at this institution.


One of the challenges in a comprehensive program review process is not to lose sight of the forest for the trees. Although it is important to treat the reviews of each individual unit uniquely, someone needs to ensure that a perspective is kept across all units. The office in charge of program review plays a critical role.

A second challenge in a comprehensive program review process is to sustain open channels of communication. Departments provide enormous amounts of information to the administration, but indicate that insufficient information comes to them from the administration. It is equally important that the office in charge of program review sustain two-way communication between departments and administration. All correspondence from departments must be acknowledged.

Institutional program review can be managed within one of a number of administrative units. For example, program review may be managed by the provost’s office, planning office, institutional research office, or the dean’s office at the school or college level. At this institution, program review is managed by the office of the vice president for administration and planning. Institutional research is also one of the functions of this office. The institutional research function supports the data needs of the program review process; professional staff in the office manage the process and provide substantive analyses of findings across units; and the vice president, among many other program review responsibilities, sustains communication among all participants in the process (from the level of the department to the university’s board of trustees). The office of the vice president ensures that the participants keep their eyes on both the forest and the trees and serves as the conduit for communication from the administration to the departments. Although this model may not work well for all institutions, it works well in this institution.

Concluding Remarks

Two summary comments can be made based on the results of this study. First, departments do not act on all program review recommendations. They are more likely to act on recommendations when the responsibility for implementing the recommendation is clearly implied or explicitly assigned to them, the goal of the recommendation matches the department’s own goals, and no additional resources are needed. Second, the administration does not act on all recommendations emanating from program review each year. It would be unreasonable to assume that it could. Instead, the administration uses the information from program review to establish priorities for the allocation of resources. It also develops broader policies that address issues and introduces new programs and initiatives across units.

What makes program review most effective is the credible information it can provide faculty and administrators so they can identify areas that require immediate attention and establish priorities that can be implemented over time. According to many of the department chairs interviewed in this study, most recommendations are on target. Furthermore, department chairs acknowledged that not every recommendation can be implemented immediately. Priorities must be set. They understand that the needs of the department must be balanced with the needs of the college and school and the institution.

References

Arns, R. G., and Poland, W. “Changing the University Through Program Review.” Journal of Higher Education, 1980, 51 (3), 268–284.

Barak, R. J. Program Review in Higher Education: Within and Without. Boulder, Colo.: National Center for Higher Education Management Systems, 1982.

Barak, R. J. The Role of Program Review in Strategic Planning. AIR Professional File, no. 26. Tallahassee, Fla.: Association for Institutional Research, 1986.

Barak, R. J., and Breier, B. E. Successful Program Review: A Practical Guide to Evaluating Programs in Academic Settings. San Francisco: Jossey-Bass, 1990.

Breier, B. E. “Program Review Policy: Intent, Implementation and Experience: A Case Study.” Lawrence: University of Kansas. Dissertation Abstracts International, 47/2, 430A (University Microfilms no. 8605337), 1986.

Conrad, C. F., and Wilson, R. F. Academic Program Reviews: Institutional Approaches, Expectations, and Controversies. ASHE-ERIC Higher Education Report, no. 5. Washington, D.C.: Association for the Study of Higher Education, 1985.

Cranton, P. A., and Legge, L. H. “Program Evaluation in Higher Education.” Journal of Higher Education, 1978, 49 (5), 464–471.

Craven, E. C. (ed.). Academic Program Evaluation. New Directions for Institutional Research, no. 27. San Francisco: Jossey-Bass, 1980.

Feasley, C. E. Program Evaluation. AAHE-ERIC/Higher Education Research Report, no. 2. Washington, D.C.: American Association for Higher Education, 1980.

Gardner, D. E. “Five Evaluation Frameworks: Implications for Decision Making in Higher Education.” Journal of Higher Education, 1977, 48 (5), 571–593.

Kells, H. R. “The Purposes and Legacy of Effective Self-Study Processes: Enhancing the Study-Planning Cycle.” Journal of Higher Education, 1980, 51 (4), 439–447.

Mets, L. A. “Implementation Strategies in a University Setting: Departmental Responses to Program Review Recommendations.” Doctoral dissertation, University of Michigan, in press.

Office of the Vice President for Administration and Planning. Academic and Administrative Program Review Procedures. Evanston, Ill.: Northwestern University, 1989.

Schmidtlein, F. A., and Milton, T. H. “College and University Planning: Perspectives from a Nation-Wide Study.” Planning for Higher Education, 1988–89, 17 (3), 1–19.

Wilson, R. F. (ed.). Designing Academic Program Reviews. New Directions for Higher Education. San Francisco: Jossey-Bass, 1982.

LISA A. METS is senior administrator in the Center for Research on Learning and Teaching at the University of Michigan, Ann Arbor, and a doctoral candidate in the University of Michigan Center for the Study of Higher and Postsecondary Education.