Faculty, Graduate Student, and Graduate Productivity in Public Administration and Public Affairs Programs, 1986-1993
Author: James W. Douglas
Source: Public Administration Review, Vol. 56, No. 5 (Sep.-Oct., 1996), pp. 433-440
Published by: Wiley on behalf of the American Society for Public Administration
Stable URL: http://www.jstor.org/stable/977042


Faculty, Graduate Student, and Graduate Productivity in Public Administration and Public Affairs Programs, 1986-1993

James W. Douglas, University of Georgia

How productive have the faculties, graduate students, and graduates of public administration and public affairs programs been over the past several years? In order to answer this question, the author examined 11 journals published between 1986 and 1993. Publication totals were used to measure the productivity of programs. The author notes that many of the programs found to have highly productive faculties in earlier studies by Legge and Devore (1987) and Morgan et al. (1981) have maintained top positions, while the remainder of the programs tended to change positions in an unpredictable manner. A relationship was also found to exist between programs with productive faculties and programs with productive graduate students and graduates. In addition, the author reveals that few scholars published more than two articles in the journals under review between 1986 and 1993.

It has been eight years since the productivity of public administration and public affairs programs was last measured by examining the faculty publications in selected journals (Legge and Devore, 1987). The purpose of this article is to duplicate and update the rankings presented by Legge and Devore and Morgan et al. (1981) and analyze how program productivity has changed over the years. In addition, in an effort to ascertain the effectiveness of public administration programs in training and motivating students to conduct scholarly research, I measured the publication productivity of the graduate students and graduates of public administration and public affairs programs. The analysis will show that a relationship exists between faculty and student productivity.

The role of public administration programs is not just to teach; it is also to advance the field and find ways to improve the practice of public administration. Providing faculty productivity measures is meaningful because they indicate the extent to which programs are contributing to the knowledge in the field. Graduate student and graduate productivity measures are useful because they are a means of determining the degree to which programs are preparing students to add substantively to the field. Uncovering which institutions tend to excel in these tasks will provide a starting point from which to determine the program characteristics that lead to important research. It will also furnish prospective doctoral students with information that can assist them in selecting graduate schools that best meet their educational needs.


Updating the Rankings: 1986-1993

To employ the same methodology used by Morgan et al. and Legge and Devore, in this analysis, I adopted the journal article as the unit of analysis.1 Programs were ranked based on the number of articles published by faculty members in 11 journals for the eight-year period 1986 through 1993. To remain consistent with previous studies, I examined the 10 journals used by Morgan et al. and Legge and Devore. These journals included: Journal of Policy Analysis and Management, Policy Studies Journal, Policy Studies Review, Administration and Society, Public Administration Review, American Review of Public Administration, Public Administration Quarterly, International Journal of Public Administration, The Public Manager, and National Civic Review.2 For the purpose of examining the productivity of programs in public administration versus public policy journals, the first three journals listed were classified as public policy and the remaining journals as public administration.
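To make the counting convention concrete (one point per article, apportioned among co-authors and credited to each author's program at the time of publication; see note 1), the short Python sketch below applies the rule to hypothetical articles. The journal labels, author names, and program names are placeholders; this is an editorial illustration, not the author's actual tabulation procedure.

```python
from collections import defaultdict

def score_programs(articles):
    """Credit each program with its authors' fractional shares of each article.

    articles: list of dicts like
        {"journal": "PAR", "authors": [("Author A", "Georgia"), ("Author B", "Kansas")]}
    Returns a dict mapping program name to total credit.
    """
    totals = defaultdict(float)
    for article in articles:
        share = 1.0 / len(article["authors"])      # fractional credit for co-authored pieces
        for _author, program in article["authors"]:
            totals[program] += share               # credit the program of record at publication
    return dict(totals)

# Hypothetical example: one sole-authored article and one two-author article.
sample = [
    {"journal": "PAR", "authors": [("Author A", "Georgia")]},
    {"journal": "PAR", "authors": [("Author B", "Georgia"), ("Author C", "Kansas")]},
]
print(score_programs(sample))   # {'Georgia': 1.5, 'Kansas': 0.5}
```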

In addition, I added The Journal of Public Administration Research and Theory3 because it did not exist at the time of the earlier studies and has become recognized as a top journal in the field (Forrester and Watson, 1994). Because it could be argued that adding The Journal of Public Administration Research and Theory to the analysis damages the comparability of this study with the Morgan et al. and Legge and Devore studies, totals which exclude publications in The Journal of Public Administration Research and Theory are also presented in the Notes.4

As Legge and Devore recognized, using this method to measure the productivity of programs is incomplete. Other forms of faculty output, such as books, government reports, articles in journals that are more specific to certain public administration subfields (such as budgeting or personnel), and articles "in journals of basic disciplinary-focused research (such as American Political Science Review)" (Legge and Devore, 1987; 148), are not included in the study. Despite the importance of these types of scholarly output, I examined only the listed journals for several reasons.

Table 1
Faculty Productivity Ratings and Number of Publications, 1986-1993

Ranking 1970-80a  Ranking 1981-85b  School  Policy Journals 1986-93  PA Journals 1986-93  Total 1986-93
2 (tie)  1  Georgia  4.0  49.08  53.08
4  6  Southern California  5.0  38.0  43.0
2 (tie)  5  Syracuse  11.5  29.67  41.17
1  8  Indiana  6.0  27.09  33.09
17  3  Virginia Tech  1.33  30.42  31.75
5 (tie)  13  SUNY, Albany  5.5  22.25  27.75
21 (tie)  2  Florida State  4.0  23.7  27.7
45  12  George Washington  3.0  20.97  23.97
42  10  Georgia State  2.0  21.81  23.81
36 (tie)  21 (tie)  Missouri, Columbia  3.0  20.33  23.33
8  16  American  2.0  21.0  23.0
34 (tie)  19  Oklahoma  5.5  15.67  21.17
5 (tie)  ---  California, Berkeley  6.33  12.33  18.67
---  18  North Carolina State  1.0  17.50  18.50
18 (tie)  7  Kansas  2.0  16.0  18.0
---  ---  Colorado, Denver  6.33  11.33  17.67
---  9  Arizona State  9.17  8.5  17.67
---  ---  San Diego State  1.0  15.58  16.58
---  ---  Baltimore  0.5  14.91  15.41
7  4  Harvard  5.0  10.33  15.33
---  ---  Penn State, Harrisburg  0.0  15.09  15.09
---  ---  South Florida  3.0  11.31  14.31
21 (tie)  32 (tie)  Rutgers  5.0  9.17  14.17
13  48 (tie)  North Carolina  6.0  8.09  14.09
---  43  Auburn  2.75  11.33  14.08
---  28  Rider  0.0  14.0  14.0
31  ---  Northern Illinois  2.83  10.83  13.67
---  ---  Wisconsin, Madison  6.17  7.0  13.17
20  15  Penn State  4.0  8.86  12.86
27  17  South Carolina  2.25  10.58  12.83
---  11  George Mason  1.5  11.0  12.5
---  ---  Cleveland State  2.0  10.46  12.46
---  ---  Louisiana State  2.0  10.17  12.17
---  ---  Wyoming  1.5  10.33  11.83
47 (tie)  ---  CUNY, Baruch  4.0  7.17  11.17
34 (tie)  48 (tie)  Missouri, St. Louis  3.0  8.0  11.0
---  ---  SUNY, Binghamton  4.0  6.5  10.5
---  20  Texas A & M  5.0  5.5  10.5
36 (tie)  ---  Vermont  2.5  7.5  10.0
47 (tie)  14  Connecticut  2.0  8.0  10.0
40 (tie)  26  Minnesota  3.0  7.0  10.0
21 (tie)  32 (tie)  Washington  5.5  4.33  9.83
---  ---  Florida International  2.0  7.5  9.5
12  44 (tie)  Pittsburgh  4.17  5.0  9.17
14 (tie)  ---  Michigan  7.33  1.75  9.08
---  48 (tie)  Southern Illinois  3.0  6.0  9.0
---  ---  Florida Atlantic  0.5  8.5  9.0
---  40 (tie)  Wisconsin, Milwaukee  2.5  6.33  8.83
---  ---  Washington State  1.5  7.0  8.5
28 (tie)  ---  Texas, Austin  2.0  6.5  8.5
---  40 (tie)  Maine  0.0  8.5  8.5
---  ---  Nebraska, Omaha  3.0  5.5  8.5
---  ---  Oklahoma State  3.5  4.5  8.0
---  40 (tie)  Western Michigan  1.5  6.25  7.75

a Morgan et al. study (1981). b Legge and Devore study (1987).

First, and most important, using the listed journals enables the work of Morgan et al. and Legge and Devore to be updated. This approach permits a cumulative look at academic productivity within public administration and public affairs programs.


Table 2
Percentage Difference in Productivity, 1981-1985 versus 1986-1993

School  1981-85  1981-85 Adjusted (x1.6)  1986-93  Percentage Change
1. Georgia  31.96  51.14  53.08  +4
2. Southern California  20.16  32.26  43.0  +33
3. Syracuse  20.5  32.8  41.17  +26
4. Indiana  14.83  23.73  33.09  +39
5. Virginia Tech  22.5  36.0  31.75  -12
6. SUNY, Albany  12.5  20.0  27.75  +39
7. Florida State  23.0  36.8  27.70  -25
8. George Washington  12.83  20.53  23.97  +17
9. Georgia State  13.5  21.6  23.84  +10
10. Missouri, Columbia  9.5  15.2  23.33  +53
11. American  11.0  17.6  23.0  +31
12. Oklahoma  10.0  16.0  21.17  +32
13. California, Berkeley  ---  ---  18.67  ---
14. North Carolina State  10.33  16.53  18.5  +12
15. Kansas  14.99  23.98  18.0  -23
16. Colorado, Denver  ---  ---  17.67  ---
16. Arizona State  14.0  22.4  17.67  -21
18. San Diego State  ---  ---  16.58  ---
19. Baltimore  ---  ---  15.41  ---
20. Harvard  21.82  34.91  15.33  -56
21. Penn State, Harrisburg  ---  ---  15.09  ---
22. South Florida  ---  ---  14.31  ---
23. Rutgers  6.5  10.4  14.16  +36
24. North Carolina  4.5  7.2  14.09  +96
25. Auburn  5.33  8.53  14.08  +65
26. Rider  7.33  11.73  14.0  +19
27. Northern Illinois  ---  ---  13.67  ---
28. Wisconsin, Madison  ---  ---  13.17  ---
29. Pennsylvania State  11.36  18.18  12.86  -29
30. South Carolina  10.5  16.8  12.83  -24
31. George Mason  13.0  20.8  12.5  -40
32. Cleveland State  ---  ---  12.46  ---
33. Louisiana State  ---  ---  12.17  ---
34. Wyoming  ---  ---  11.83  ---
35. CUNY, Baruch  ---  ---  11.17  ---
36. Missouri, St. Louis  4.5  7.2  11.0  +53
37. SUNY, Binghamton  ---  ---  10.5  ---
37. Texas A & M  9.83  15.73  10.5  -33
39. Vermont  ---  ---  10.0  ---
39. Connecticut  12.33  19.73  10.0  -49
39. Minnesota  8.0  12.8  10.0  -22
42. Washington  6.5  10.4  9.83  -5
43. Florida International  ---  ---  9.5  ---
44. Pittsburgh  5.0  8.0  9.17  +15
45. Michigan  ---  ---  9.08  ---
46. Southern Illinois  4.5  7.2  9.0  +25
46. Florida Atlantic  ---  ---  9.0  ---
48. Wisconsin, Milwaukee  5.5  8.8  8.83  0
49. Washington State  ---  ---  8.5  ---
49. Texas, Austin  ---  ---  8.5  ---
49. Maine  5.5  8.8  8.5  -3
49. Nebraska, Omaha  ---  ---  8.5  ---
53. Oklahoma State  ---  ---  8.0  ---
54. Western Michigan  5.5  8.8  7.75  -12

Second, journals of basic disciplinary-focused research contain articles from a variety of fields. Determining which articles should be classified as public administration would be an unmanageable task. Third, it would be difficult to attain a list of government reports and public administration books. Fourth, determining how heavily to weigh government reports and books in comparison to journal articles would be troublesome. Finally, government reports and books are not usually subject to the same rigorous review processes as are journal articles. It should be noted that the indicators of productivity used in this article are measures of faculty, graduate student, and graduate activity. No attempt is made to measure the impact that the publications under review have had upon the field.

Faculty Publication Totals

Tables 1 through 3 present the results of the faculty publication ratings for 1986 to 1993. Table 1 provides the publication totals for the top 54 programs.5 These totals are further divided to show the number of publications produced within public policy and public administration journals. Table 1 also includes the ratings computed by both the Morgan et al. and the Legge and Devore studies. It should be remembered that the productivity totals were derived only from a select list of journals. As a result, programs that produce large numbers of books, government documents, and/or articles in other journals may rank low in Table 1.

Between 1986 and 1993, almost 1,100 faculty members from over 300 institutions contributed approximately 1,500 articles to the listed journals. An average of 4.79 articles (3.54 public administration and 1.25 public policy) was produced by each program. Table 1 reveals that most of the top programs listed more than doubled this amount; the top 25 institutions averaged 23.30 articles, and the next 25 averaged 10.63 articles per program.

Significant changes from the previous rankings of public administration programs can be found in Table 1. Although 7 of the top 10 schools from the 1981-85 rankings by Legge and Devore (Georgia, Southern California, Syracuse, Indiana, Virginia Tech, Florida State, and Georgia State) remained in the top 10, Georgia, Southern California, Syracuse, and Indiana were the only schools to hold positions in the top 10 for each of the three time periods. Two of these schools, Georgia and Syracuse, managed to maintain their top 5 status in each of the three studies. Only half of the top 20 schools in the Morgan et al. study (1970-80) remained in the top 20. However, 15 of the top 20 schools from the Legge and Devore survey (1981-85) continued to hold top 20 positions, and none of the top 10 programs from the 1981-1985 period dropped out of the top 20. Of the top 10 institutions, Harvard experienced the greatest fall, moving 16 places from 4th to 20th. Other top 20 schools that fell significantly included Texas A & M (from 20th to 37th tie), George Mason (from 11th to 31st), and Connecticut (from 14th to 39th tie).

A number of schools dramatically improved their productivity ratings. Three schools that did not receive any ratings in either of the first two studies were now in the top 20. These programs were Colorado at Denver (16 tie), San Diego State (18), and Baltimore (19). In addition, California at Berkeley, which was not ranked in the Legge and Devore study but was ranked 5th in the Morgan et al. survey, was rated 13th. Overall, 21 programs not listed in the Legge and Devore study now showed up in the rankings. Other schools that made important improvements in ranking included SUNY-Albany (from 13th to 6th), Missouri at Columbia (from 21st tie to 9th), Rutgers (from 32nd tie to 23rd), North Carolina (from 48th tie to 24th), and Auburn (from 43rd to 25th).


The majority of the ranked programs produced far more public administration than public policy articles. Only three programs (Arizona State, Washington, and Michigan) produced a larger number of public policy articles, and only Syracuse yielded more than ten policy articles. Surprisingly, Harvard, which generated far more policy than administration articles in the earlier studies,6 turned out few policy articles this time around. Overall, the high production of public administration articles relative to public policy articles was probably due to the greater number of public administration journals being examined in the study.

Unlike the Legge and Devore (p. 150) study, which found "a tremendous surge in productivity from the 1970-1980 period," increases from the 1981-1985 period were found to be more modest. Table 2 is used to compare program production in the listed journals over the 1981-1985 and 1986-1993 periods. To adjust for the difference in the number of years, I multiplied the 1981-1985 totals by 1.6. North Carolina at Chapel Hill experienced the largest increase (96 percent). However, few programs increased their productivity by more than 50 percent. In addition, several programs had decreases of 40 percent or more. For the most part, the ranked programs earned moderate increases, and few top schools saw their productivity decline. Some of the increases may be attributable to the addition of the Journal of Public Administration Research and Theory to the list of journals under study. It may also be possible that journals have increased the number of articles they include within each volume. A more plausible explanation might be that changes for individual programs are the result of a gain or loss of key faculty members or perhaps a shift in orientation toward journals that are more specific to certain subfields. The latter explanation would seem to support the idea that the field of public administration is becoming more specialized (Stillman, 1991).
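The adjustment and percentage figures in Table 2 follow directly from the 1.6 multiplier (eight years divided by five). As a worked illustration, an editorial sketch rather than the author's own calculation, the Georgia and Virginia Tech rows can be reproduced as follows:

```python
def percentage_change(total_81_85, total_86_93):
    """Scale the five-year 1981-85 total to an eight-year basis, then compare."""
    adjusted = total_81_85 * 1.6                                  # 8 years / 5 years = 1.6
    pct = round(100 * (total_86_93 - adjusted) / adjusted)        # change relative to the adjusted figure
    return round(adjusted, 2), pct

print(percentage_change(31.96, 53.08))   # Georgia: (51.14, 4)        -> +4 percent
print(percentage_change(22.5, 31.75))    # Virginia Tech: (36.0, -12) -> -12 percent
```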

Legge and Devore speculated that program productivity may be a function of faculty size. Table 3 replicates their efforts to show the productivity of programs as a ratio of the number of faculty.7 Program totals were divided by the "number of full-time faculty teaching in program" as listed in the 1988 Directory, Programs in Public Affairs and Administration of the National Association of Schools of Public Affairs and Administration (NASPAA).8
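The Table 3 figure is simply each program's Table 1 total divided by its NASPAA faculty count. A brief illustrative check (an editorial sketch; the article totals and faculty counts below are those reported in Tables 1 and 3):

```python
def per_faculty_ratio(total_articles, full_time_faculty):
    """Publications per full-time faculty member, rounded as in Table 3."""
    return round(total_articles / full_time_faculty, 2)

print(per_faculty_ratio(53.08, 12))   # Georgia: 4.42
print(per_faculty_ratio(41.17, 17))   # Syracuse: 2.42
```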

The data in Table 3 show that the ratings of several programs with small faculty sizes improved when per capita measures are employed. Louisiana State, Wisconsin at Madison, Arkansas at Little Rock, Wyoming, and North Carolina State were the most striking instances. In addition, 3 top 10 schools with large faculties from Table 1 fell out of the top 25 in Table 3. These included Southern California (from 2nd to 38th), Indiana (from 4th to 60th), and SUNY-Albany (from 6th to 30th). However, most of the top programs in Table 1 did not experience dramatic drops. Of the top 25 programs in Table 1, 16 remained in the top 25 positions in Table 3.

Table 3 also reveals how the ratings have changed since the 1981-1985 period. Eight of the top 10 schools from the earlier time period managed to remain in the top 20. Four (Georgia, Virginia Tech, Missouri at Columbia, and George Washington) continued to hold positions in the top 10. However, only 3 other programs listed in the Legge and Devore study (Syracuse, Southern Illinois, and South Carolina) remained in the top 25. This constituted a turnover of 14 schools. Five of these schools (Louisiana State, Wisconsin at Madison, Arkansas at Little Rock, Wyoming, and North Carolina State) have worked their way into the top 10. Most of these programs had relatively small faculties. This may be the result of a high degree of movement of faculty members from program to program, a process that would enable smaller institutions to rise swiftly in the rankings by attracting a highly productive scholar.

Graduate Student Productivity

Although measuring the journal productivity of a program's faculty provides some evidence of the quality of faculty members, it does not provide much information about how well the program trains its students to become productive scholars. One way to do this is to measure the productivity of graduate students. Table 4 displays the top 25 schools when ranked by the number of publications produced by graduate students between 1986 and 1993 in the listed journals.9 The same methodology employed above was used to derive the publication totals. However, programs were awarded credit only for articles authored by current graduate students.10 Schools were given credit when the journals indicated that an author was a graduate student in a public administration, government and public affairs, or political science program. Unfortunately, it was not always possible to ascertain whether a particular author was a graduate student. As a result, the totals in Table 4 may slightly underestimate actual publication rates.
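Note 10 spells out the co-authorship rule behind these credits: a program receives only the graduate-student share of a jointly authored article. The short sketch below (illustrative only, not the author's procedure) reproduces that arithmetic.

```python
def grad_student_credit(num_grad_authors, num_faculty_coauthors):
    """Fraction of one article credited to the program for its graduate-student authors."""
    total_authors = num_grad_authors + num_faculty_coauthors
    return round(num_grad_authors / total_authors, 2)

print(grad_student_credit(1, 2))   # one student, two faculty co-authors -> 0.33 (note 10's example)
print(grad_student_credit(1, 0))   # sole-authored by a graduate student -> 1.0
```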

Interestingly, 6 of the top 10 programs when ranked by overall faculty productivity also appeared among the top 10 for graduate student productivity.

Table 3
Productivity as a Ratio of the Number of Faculty and Ranking, 1986-1993

School  Articles  Faculty  Ratio  Overall Ranking, 1981-85
1. Georgia  53.08  12  4.42  2
2. Louisiana State  12.17  3  4.06  ---
3. Virginia Tech  31.75  8  3.97  1
4. Missouri, Columbia  23.33  6  3.89  7 (tie)
5. Wisconsin, Madison  13.17  4  3.29  ---
6. Arkansas, Little Rock  5.92  2  2.96  ---
7. Wyoming  11.83  4  2.95  ---
8. North Carolina State  18.50  7  2.64  ---
9. Syracuse  41.17  17  2.42  12
10. George Washington  23.97  10  2.40  9
11. Kansas  18.00  8  2.25  7 (tie)
12. Penn State, Harrisburg  15.09  7  2.16  ---
13. Florida State  27.70  13  2.13  4
14. South Florida  14.31  7  2.04  ---
15. Baltimore  15.41  8  1.93  ---
16. Penn State  12.86  7  1.84  6
17. Southern Illinois  9.00  5  1.80  16
18. California, Berkeley  18.67  11  1.70  ---
19. Connecticut  10.00  6  1.67  3
20. San Diego State  16.58  10  1.66  ---
21. Rutgers  14.17  9  1.57  ---
22. James Madison  7.50  5  1.50  ---
23. South Carolina  12.83  9  1.43  14
24. Texas Tech  7.11  5  1.42  ---
25. North Carolina  14.09  10  1.41  ---


Table 4
Number of Graduate Student Publications, 1986-1993

School  Graduate Student Publications
1. George Washington  6.0
1. Virginia Tech  6.0
3. American  3.0
4. California, Berkeley  2.5
4. Georgia  2.5
6. Ohio State  2.0
7. Syracuse  1.83
7. Kansas  1.83
9. SUNY, Stony Brook  1.5
9. North Carolina  1.5
9. Florida State  1.5
9. St. Louis  1.5
9. Southern California  1.5
14. Texas A & M  1.17
15. Cornell  1.0
15. Northern Illinois  1.0
15. Northwestern  1.0
15. Rutgers  1.0
15. George Mason  1.0
15. Arizona State  1.0
15. New Mexico  1.0
15. Pennsylvania  1.0
23. Carnegie Mellon  0.92
24. North Texas  0.83
25. South Florida  0.67

These schools included George Washington, Virginia Tech, Georgia, Syracuse, Florida State, and Southern California. Three other top 15 schools from Table 1 (American, California at Berkeley, and Kansas) appear in the top 10 in Table 4. Indiana was the only top 5 program in Table 1 to fall out of the rankings.

Another interesting point is that most of the institutions in Table 4 maintain doctoral programs. This is not a particularly surprising finding because Master of Public Administration programs are much less research oriented than doctoral programs. This is due to the high number of master's level students who plan to pursue careers as practitioners rather than scholars. They have less of an incentive to publish journal articles than doctoral students who are interested in pursuing academic careers. The high number of schools with doctoral programs listed in Table 4 may also be due to a higher level of resources available to programs that provide doctoral educations.

One disappointing finding was the overall low number of graduate student publications in the listed journals between 1986 and 1993. Forty-four schools registered a total of only 54.06 graduate student publications. This amounted to an average of 1.23 publications per school. When the more than 300 schools which registered faculty member publications were considered, the average dropped to 0.17 graduate student publications per school. The top 20 schools in Table 1 accounted for 54.5 percent of these publications, while the programs ranked between 21 and 54 accounted for 16.3 percent, and the unranked programs accounted for 27.6 percent. Given that the number of graduate student publications may be considered an indicator of a program's effectiveness at training students to perform academic research, these results raise a question about whether programs overall are adequately preparing students to conduct scholarly research. However, programs that sustain highly productive faculties tend to do a better job than programs with less productive faculties. The significance of this point will be addressed later.

Another method of measuring the effectiveness of a program is to examine how productive the students of that program have become at producing scholarly research once they have completed their doctoral education. The logic here is that schools which do a better job of providing students with the skills they need to conduct quality research will generate a larger number of high-caliber scholars than schools which are less adept at this task. To construct such a measure, I recorded the productivity of individual faculty members following the same methodology employed to rank entire programs; that is, faculty members were given a score of 1 for each article they published in the listed journals (see note 1) during the period 1986 to 1993. From the list of authors compiled (approximately 1,100), the 146 most productive professors were selected for study. These faculty members constituted all those authors who published more than two articles in the listed journals between 1986 and 1993. They made up roughly 13 percent of the total number of authors yet were responsible for almost 40 percent of the articles published. They also averaged 3.85 (3.26 public administration and 0.59 public policy) articles per author as compared to the 1.35 average (1.0 public administration and 0.35 public policy) for all 1,100 authors. The objective of this part of the study was to rank programs based on the number of top scholars (those with more than two publications in the listed journals between 1986 and 1993) that each institution produced. This part of the study did not examine authors who published two articles or fewer. Ranking programs based on the total number of their graduates who published articles in the listed journals would have inflated the scores of programs that generated a large number of relatively unproductive graduates. In addition, determining the institutions from which all of these authors received their doctorates would have been exceedingly difficult.

Once a list of the top 146 authors was compiled, the schools from which they received their doctorates were identified.11 Programs were then rated based on the number of top professors who graduated from their institutions and the number of publications produced by those professors. The number of top authors graduating from a program was the primary measure of interest. The productivity score (the total number of publications generated by a program's top graduates in the listed journals for the period 1986 to 1993) of a program's graduates was used to break ties between schools that produced an equal number of top scholars. As a result, a program that generated a large number of top authors will rank higher than a program that yielded a smaller number of top authors but whose top graduates produced a larger number of publications. Ranking programs in this manner, rather than simply providing productivity totals of program graduates, paints a clearer picture of the effectiveness of schools in producing top scholars. Given the modest number of top authors, productivity totals alone could be deceiving in that a single extraordinarily productive graduate could make it appear that a school has produced a significant quantity of elite scholars when in fact it has not. Relying on the number of top graduates as the primary unit of analysis alleviates this problem.
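The composite ordering just described, count of top graduates first and their combined publication totals as a tiebreaker, can be expressed in a few lines. The sketch below uses hypothetical graduates and totals, not the article's data, simply to show the sorting rule behind Table 5.

```python
from collections import defaultdict

# Hypothetical data: (doctoral program, that graduate's 1986-93 publication total).
top_graduates = [
    ("Syracuse", 8.5), ("Syracuse", 4.0), ("Syracuse", 3.5),
    ("Yale", 9.75), ("Yale", 6.0),
    ("Southern California", 6.0), ("Southern California", 5.5),
]

counts = defaultdict(int)    # number of top graduates per program (primary criterion)
pubs = defaultdict(float)    # their combined publication totals (tiebreaker)
for program, total in top_graduates:
    counts[program] += 1
    pubs[program] += total

# Sort by graduate count first, then by combined publications, both descending.
ranking = sorted(counts, key=lambda p: (-counts[p], -pubs[p]))
for place, program in enumerate(ranking, start=1):
    print(place, program, counts[program], round(pubs[program], 2))
# 1 Syracuse 3 16.0
# 2 Yale 2 15.75
# 3 Southern California 2 11.5
```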


The top publishers in the listed journals for the period 1986 to 1993 received their doctorates from 70 schools, an average of only two graduates per school. Table 5 shows that Syracuse is the most dominant school at producing top public administration scholars, while Southern California finishes a distant second.12 Eight of the top 15 schools (Syracuse, Southern California, Georgia, North Carolina, Ohio State, California at Berkeley, Florida State, and George Washington) ranked in the top 10 for number of graduate student publications (Table 4). All but 1 of these 8 schools (Ohio State) finished in the top 25 in the faculty productivity ratings (Table 1). In addition, all 8 finished in the top 15 for faculty productivity at least once in either the Morgan et al. or the Legge and Devore studies (7 finished in the top 10 and 6 finished in the top 5 at least once).

These results, coupled with the findings from Table 4, seem to indicate that the set of distinctly research-oriented university environments in the field of public administration, at least as measured by output in the listed journals, is relatively small and concentrated. Programs that maintain highly productive faculties seem to be more effective at generating top academicians than programs with relatively unproductive faculties. Students who attend these institutions work with faculty members who are busy conducting research. Being exposed to these professors, and sometimes publishing articles with them, furnishes students with the motivation, skills, and knowledge essential to conduct scholarly research.

Unfortunately, both the number of scholars producing more than two articles and the number of graduate student publications in the listed journals were quite low. This finding seems to support the conclusions of previous studies that public administration programs are doing a relatively poor job of training students to carry out scholarly research (White, 1986; McCurdy and Cleary, 1984). However, the positive relationship between faculty productivity and these two variables may suggest routes to explore to improve the education of research scholars. Doing so will require future researchers to examine in more detail the characteristics of programs with both productive faculties and productive graduates and graduate students.

Top Scholars

Data gathered for this study allow identification of the most productive scholars in the listed journals during the period 1986 to 1993. These faculty members, their article totals, and their institutional affiliations as of January 1995 are listed in Table 6.13 It must be emphasized that this ranking is based only on the listed journals, and that a large number of additional scholars are sure to have been extremely productive in other journals relevant to public administration. It must also be noted that small recording errors may have occurred in the data, so it would not be appropriate to draw sweeping conclusions on the basis of this tabulation. The prominence of the scholars included in the list, nevertheless, suggests a significant degree of face validity for the journal productivity measures employed both in this article and in the Morgan et al. and Legge and Devore studies.

Table 5
Programs Rated by the Number of Graduates with More than Two Publications, 1986-1993

School  Number of Top Graduates  Top Graduate Publication Totals
1. Syracuse  17  71.40
2. Southern California  10  39.58
3. Indiana  6  18.5
4. Yale  5  28.75
5. Georgia  5  20.83
6. Michigan  5  17.17
7. North Carolina  5  15.33
8. Harvard  4  20.0
9. Ohio State  4  17.67
10. Oklahoma  4  15.0
10. New York University  4  15.0
12. California, Berkeley  4  13.0
13. Chicago  3  19.0
14. Florida State  3  13.13
15. George Washington  3  12.0
16. Virginia  3  10.83
17. Penn State  3  10.2
18. Claremont  3  9.5
19. SUNY, Binghamton  2  11.5
20. SUNY, Albany  2  9.0
21. Miami University  2  7.0
21. Johns Hopkins  2  7.0
23. Maryland  2  6.67
23. Kansas  2  6.67
25. Kentucky  2  6.16
26. Northwestern  2  6.0
26. Columbia  2  6.0
28. Princeton  2  5.5
28. Cornell  2  5.5

Conclusion

This article has reported on measures of the productivity of faculties, graduate students, and graduates of public administration and public affairs programs. Faculty productivity totals show that the most productive public administration programs, as measured by the listed journals, have tended to remain productive over time.

Table 6
Most Productive Scholars, 1986-1993

Scholar  Publications  School Affiliation as of January 1995
Robert Golembiewski  13.0  Georgia
Robert Durant  8.58  Baltimore
David Rosenbloom  8.5  American
Patricia W. Ingraham  8.5  Syracuse
Charles T. Goodsell  8.0  Virginia Tech
Laurence O'Toole, Jr.  7.58  Georgia
Gregory B. Lewis  7.5  American
Barry Bozeman  7.17  Georgia Tech
Susan A. MacManus  7.13  South Florida
Norton S. Long  7.0  Missouri, St. Louis/Virginia Techa

a Professor Long was a faculty member at Missouri, St. Louis, and Virginia Tech at the time of his death in 1993.


Of these programs, two seem to stand out: Georgia and Syracuse. Both have repeatedly appeared in the top 10 in the faculty and student productivity ratings. In addition, Georgia is the only school to remain in the top 5 for the Morgan et al. productivity rankings, the Legge and Devore productivity and productivity ratio rankings, and each of the rankings presented in this article.

The remainder of the programs are less successful at holding their positions in the rankings. These less productive schools tend to move up and down the rankings in an unpredictable manner, perhaps as the result of the movement of productive professors from program to program. Indeed, of the top scholars identified in Table 6, 8 have changed institutions at least once since the initial date of this productivity study. These results, in conjunction with the finding that only 146 out of 1,100 authors published more than two articles in the listed journals between 1986 and 1993, indicate that relatively few faculty members are successful at producing large amounts of publishable material. As a result, the movement of a small number of productive scholars is able to alter the rankings dramatically.

Possibly the most interesting finding on an issue not explored in previous studies of scholarly production is the relationship between faculty output and graduate student achievement. The most productive schools tend to educate the most productive graduate students and graduates. Although this relationship is in no way conclusive (only 11 journals were examined), it does offer a direction for future research. This question is especially important if doctoral programs are inadequately preparing their students to conduct research. Future studies may want to examine in depth those programs that are more successful at producing top students. This kind of investigation may enable researchers to identify the characteristics of those programs that lead to success, and may ultimately contribute to an overall improvement in the quality of scholarship.14

Finally, the findings in this article may serve as an aid for prospective graduate students trying to decide which program to attend. The rankings reveal important information about where research is being performed, where graduate students tend to publish, and which schools have tended to produce top scholars. While future students should not base their decisions about which school to attend on the rankings alone, data such as these do provide a good starting point for those committed to the serious pursuit of a scholarly career.

James W. Douglas is currently a DPA student at the University of Georgia. He is also a graduate of the MPA program at the University of Baltimore. Among his research interests are public finance, budgeting, and doctoral education in public administration programs.

Notes

The author wishes to thank Tony Neuser for his helpful assistance.

1. Each article was assigned a value of 1. Fractions were apportioned for articles with multiple authors. No credit was given for book reviews, letters to the editor, rejoinders, and the like. "Finally, only those authors who could be identified as being faculty members primarily involved in the public administration program were included in the rankings" (Legge and Devore, 1987; 155). Like Morgan et al. and Legge and Devore, authors were only counted if they could be identified as "faculty members in departments of public administration, political science, government and public affairs, or affiliated with university government research bureaus. In cases of combined departments (e.g., Department of Government and Business Administration, Department of Social Sciences), scores were awarded only to those who were listed in public administration or political science faculty" (Morgan et al., 1981; 668). While faculty may have moved to other institutions during the period under study, their publications were awarded to the school they were affiliated with at the time of publication.

2. During the time period under consideration, The Bureaucrat changed its name to The Public Manager. Articles under both titles were included in the study. In addition, volume 12, numbers 3 & 4, 1993, of Policy Studies Review were not available at the time of the study.

3. The Journal of Public Administration Research and Theory did not begin publication until 1991.

4. These totals are presented in endnotes rather than separate tables because excluding The Journal of Public Administration Research and Theory had only a marginal effect on the rankings.

5. When The Journal of Public Administration Research and Theory was excluded, the productivity totals for the top 25 schools for 1986-1993 were as follows:
1) Georgia 49.41
2) Southern California 42.00
3) Syracuse 36.00
4) Indiana 32.42
5) Virginia Tech 31.75
6) SUNY-Albany 27.75
7) Florida State 26.20
8) George Washington 23.97
9) Georgia State 23.81
10) Missouri-Columbia 23.33
11) American 22.5
12) Oklahoma 20.17
13) North Carolina St. 17.50
14) Colorado-Denver 17.17
14) Arizona State 17.17
16) Kansas 17.00
17) San Diego State 16.58
18) Penn State-Harrisburg 15.09
19) California-Berkeley 14.67
20) Rutgers 14.17
21) Auburn 14.08
22) Rider 14.0
23) Northern Illinois 13.67
24) Baltimore 13.58
25) South Florida 13.31

6. The Morgan et al. survey recorded that Harvard produced 19.5 public policy articles and 3 public administration articles between 1970 and 1980. Legge and Devore reported that Harvard generated 16.32 public policy and 5.5 public administration articles between 1981 and 1985.

7. When The Journal of Public Administration Research and Theory was excluded, the rankings for the productivity of programs as a ratio of the number of faculty for 1986-1993 were as follows:

1) Georgia 4.12
2) Virginia Tech 3.97
3) Missouri-Columbia 3.89
4) Louisiana State 3.78
5) Wisconsin-Madison 3.29
6) Arkansas-Little Rock 2.96
7) Wyoming 2.95
8) North Carolina State 2.50
9) George Washington 2.40
10) Penn State-Harrisburg 2.16
11) Kansas 2.13
12) Syracuse 2.12
13) Florida State 2.02
14) South Florida 1.90
15) Southern Illinois 1.80
16) Penn State 1.77
17) Baltimore 1.70
18) Connecticut 1.67
19) San Diego State 1.66
20) Rutgers 1.57
21) James Madison 1.50
22) South Carolina 1.43
23) Texas Tech 1.42
24) Auburn 1.41
25) Cleveland State 1.38

8. Legge and Devore used the 1984 Directory, Programs in Public Affairs and Administration. To follow their approach as closely as possible, the most recent edition produced by the National Association of Schools of Public Affairs and Administration (NASPAA) was used. The 1988 edition was the last to list the number of faculty in each program. However, because the 1988 edition fell within the period under study, it was adequate for my purposes. NASPAA was contacted in an effort to find more recent data, but, unfortunately, they no longer collect this type of information. The 1988 edition failed to give totals for several programs. As a result, it was not possible to calculate a ratio measure for Georgia State, Rider, Washington State, Oklahoma State, and North Texas State.

9. Because The Journal of Public Administration Research and Theory does not indicate whether an author was a graduate student, excluding it from the analysis does not change the rankings in Table 4.

10. Programs only received partial credit for articles co-authored with a faculty member(s) from the same program. For example, if two faculty members and one graduate student published an article together, the program would be given a score of 0.33. Because of the low number of graduate student publications, totals were not divided into public administration and public policy categories.

11. Faculty members who did not receive their doctorates in either political science, public administration, or public policy were not used in this portion of the study. Only 2 of the 146 faculty members were discarded for this reason.

12. Excluding The Journal of Public Administration Research and Theory from the analysis resulted in only two changes: Syracuse fell from 17 top graduates to 16, and Ohio State fell from 4 to 3.

13. When The Journal of Public Administration Research and Theory was excluded from the analysis, the productivity totals for the top 10 scholars for 1986 to 1993 were as follows:

1) Robert T. Golembiewski 12.50
2) David Rosenbloom 8.50
2) Patricia W. Ingraham 8.50
4) Charles T. Goodsell 8.00
5) Gregory B. Lewis 7.50
6) Robert Durant 7.25
7) Susan A. MacManus 7.13
8) Norton S. Long 7.00
9) Norma M. Riccucci 6.83
10) Laurence O'Toole, Jr. 6.58

14. A good starting point for this type of research might be to examine the variables used by Ferris and Stallings (1988) to identify the characteristics of programs that are related to program reputation.

References

Directory, Programs in Public Affairs and Administration, 1988. Washington, DC: National Association of Schools of Public Affairs and Administration.

Ferris, James M. and Robert A. Stallings, 1988. "Sources of Reputation Among Public Administration and Public Affairs Programs." American Review of Public Administration, vol. 18 (September), 309-323.

Forrester, John P. and Sheilah S. Watson, 1994. "An Assessment of Public Administration Journals: The Perspective of Editors and Editorial Board Members." Public Administration Review, vol. 54 (September/October), 474-485.

Legge, Jerome S. and James Devore, 1987. "Measuring Productivity in U.S. Public Administration and Public Affairs Programs 1981-1985." Administration and Society, vol. 19 (August), 147-156.

McCurdy, Howard E. and Robert E. Cleary, 1984. "Why Can't We Resolve the Research Issue in Public Administration?" Public Administration Review, vol. 44 (January/February), 49-56.

Morgan, David R. et al., 1981. "Reputation and Productivity Among U.S. Public Administration and Public Affairs Programs." Public Administration Review, vol. 41 (November/December), 666-673.

Stillman, Richard J., II, 1991. Preface to Public Administration: A Search for Themes and Direction. New York: St. Martin's Press.

White, Jay D., 1986. "Dissertation and Publications in Public Administration." Public Administration Review, vol. 46 (May/June), 227-234.
