Remaking the Social. New Risks and Solidarities
The First International Conference of the Society of Sociologists from Romania
Faculty of Sociology and Social Work, “Babeş-Bolyai” University Cluj-Napoca
2-4 December, 2010
Thematic session: Eleven years after Bologna: reform or crisis escalation?
Convener: Adrian Hatos, University of Oradea; Bogdan Voicu, Quality of Life Research
Institute – ICCV
Conference paper. Do not quote without author’s permission
Academic Rankings. The Competition for Recognition. Case Study:
Babeş-Bolyai University, Cluj-Napoca
Teodora Capota, Babeş-Bolyai University
Abstract
In recent years, higher education in Romania has been facing profound changes, especially due
to the Bologna process reforms. The link between social expectations, public resources for higher
education and the uncertain economic environment highlights current and emerging issues for
higher education institutions, such as the erosion of the value of academic degrees. Moreover, changes
both within and outside higher education institutions are threatening the nature and makeup of
higher education: its students, faculty, curriculum, and functions. Nevertheless, academics are
trying to offer a strong educational background by setting standards that encourage
continuous improvement in the management of the quality of higher education.
In common-sense terms, the prestige of a university is a synonym for a valuable diploma. Rankings
are one of the tools that measure the activity of institutions: they provide strong indicators of the
impact and prestige of universities, summarize their global performance, supply information for
prospective students and scholars, and reflect a commitment to the dissemination of scientific
knowledge.
Although the quality of higher education institutions is rigorously operationalized, in
some cases the relevance of rankings is debatable. The objectives of each ranking may differ
slightly: whilst a global ranking seeks to identify world-class universities, contributing
to the global progress of science, society and scholarship, a regional ranking should adapt to the
realities of the region in question. However, interest in rankings amongst Romanian higher
education institutions is becoming increasingly visible. Therefore, it can be strongly argued that at
the academic level there is a real competition for recognition.
This paper aims, first, to explore in somewhat general terms what we mean by "academic
rankings" and what their major dimensions may be; second, to contrast the Romanian rankings with
their counterparts among international rankings; and finally, to discuss the positioning of the
Babeş-Bolyai University in international and Romanian rankings.
Ranking universities: What for?
"This [..] will be an invaluable tool for you if you are bright, well educated, and aiming
for a successful career after university, or, if you are a parent wanting to give your son or
daughter an edge in his or her choice of university study" - so state the authors of one of
the world's most publicized rankings1. If we were to summarize the stated purpose, we would
say that rankings are created only to help students choose a "quality" university. But is
that all? In addition to this, De Maret (2007: 36) considers that, in terms of utility, institutional
rankings respond to several other needs, such as: the need of scientists to know where to work; the
need of governments to know where to invest; and the need of university leaders to know where
they stand. Since education has become a core element in a competitive world market, rankings
meet the demands of "customers" regarding the educational "products" and scientific
"services" offered by higher education institutions (Slaughter and Leslie, 1997); they function as
institutional marketing strategies (Buela-Casal et al., 2007: 350) and often as advertising
instruments; they stimulate competition among HEIs, guide allocations of public funds, help
differentiate among types of institutions, programs and disciplines, and contribute to the
definition of the "quality" of HEIs within a particular context (Berlin Principles on Ranking of
Higher Education Institutions, 2006). In the global education market, where the customer and the
service provider may be geographically separated, rankings provide a global frame of reference
for the intrinsic value of institutions (Pathak and Pathak, 2010: 167), creating transparency
about the higher education system (Berghoff and Federkeil, 2009: 41). Other approaches
emphasize the fact that rankings are meant simply to feed public appetites for data on
institutional status (Marginson and Van der Wende, 2007: 319). Regardless of the purposes
they serve, stated or not (continuous improvement, strategic planning, accountability, competitive
marketing, enhancement of performance and productivity, promotion of prestige and visibility;
Proulx, 2007: 167), rankings of higher education institutions have become a global
phenomenon.
1 The 4th edition of the Top Universities Guide 2010, which includes the QS World University Rankings.
The “Winner’s” Approach. Evidence from International Academic Rankings
To be "number one", "the best" or "on top" seems to be a desirable attribute for
players in various fields, including education. For this to be possible, some kind of
league, national or international, to which one can only be admitted based on
performance, needs to exist, and performance-based listings of research and education, or higher
education rankings, follow the rules of evaluation. This means that the higher the performance, the
better the chance a HEI has to occupy a higher position. Van Raan (2007: 87) considers that reaching
high-ranking positions implies becoming a member of an elite league. In fact, the small fraction
that represents the "elite league" of the stratified higher education landscape enjoys great visibility,
while the large base of the prestige pyramid is relegated more or less to obscurity (Zhao,
2007: 323). A core feature of rankings, then, is that they make distinctions and allocate status
and prestige. But how do they work?
Put simply, university rankings are lists of institutions ranked comparatively, in descending
order, according to a common set of indicators. As a first step, however, it is important to
determine what to compare and how to assess it: if goals or standards are set a priori and one's
activity or achievements are compared to these goals or standards, we are in a context of absolute
assessment; if one's performance is compared with that of one's peers, we are in a context
of relative assessment (Vincke, 2009: 12). The "Shanghai" ranking and the "Times" ranking are
based on relative assessment, since they relate every institution's performance to that of the
first-ranked institution.
The “Shanghai” ranking
The first world ranking of universities, the Academic Ranking of World Universities
(ARWU), was published in 2003 by the Center for World-Class Universities and the Institute of
Higher Education of Shanghai Jiao Tong University, China, and has been updated annually ever
since. It initially aimed to establish the global standing of top Chinese universities; at present, it is
one of the internationally recognized academic ranking systems.
Regarding the ranking methodology, a first remark concerns the selection of
universities: ARWU considers every university that has any Nobel laureates, Fields medalists,
highly cited researchers, or papers published in Nature or Science. In addition, universities with a
significant number of papers indexed by the Science Citation Index-Expanded (SCIE) and the Social
Science Citation Index (SSCI) are also included. As for the scoring procedure, universities are
ranked by several indicators of academic or research performance: for each indicator, the highest-
scoring institution is assigned a score of 100, and the scores of the other institutions are calculated
as a percentage of the top score; the scores for each indicator are then weighted (see Table 1) to
arrive at a final overall score for an institution. An institution's rank reflects the number of
institutions that sit above it.
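As a minimal illustration of this scoring procedure, the following Python sketch normalizes each indicator against the top performer and forms the weighted sum; the sample data and institution names are invented, and this is not ARWU's actual implementation (in the real methodology PCP is itself derived from the other indicators):

```python
# Minimal sketch of the ARWU scoring procedure described above.
# Weights follow Table 1; the sample data are purely illustrative.

ARWU_WEIGHTS = {"Alumni": 0.10, "Award": 0.20, "HiCi": 0.20,
                "N&S": 0.20, "PUB": 0.20, "PCP": 0.10}

def arwu_scores(raw):
    """raw: {institution: {indicator: raw value}} -> overall scores.

    For each indicator the top institution receives 100 and the others a
    percentage of the top raw value; the weighted indicator scores are
    then summed into the overall score.
    """
    overall = {}
    for indicator, weight in ARWU_WEIGHTS.items():
        top = max(inst[indicator] for inst in raw.values())
        for name, inst in raw.items():
            scaled = 100.0 * inst[indicator] / top if top else 0.0
            overall[name] = overall.get(name, 0.0) + weight * scaled
    return overall

# Hypothetical example with two institutions:
data = {
    "University A": {"Alumni": 20, "Award": 30, "HiCi": 50,
                     "N&S": 60, "PUB": 800, "PCP": 40},
    "University B": {"Alumni": 10, "Award": 15, "HiCi": 25,
                     "N&S": 30, "PUB": 400, "PCP": 20},
}
print(sorted(arwu_scores(data).items(), key=lambda kv: -kv[1]))
```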
Table 1. Indicators and Weights for ARWU

Criteria | Indicator | Code | Weight
Quality of Education | Alumni of an institution winning Nobel Prizes and Fields Medals | Alumni | 10%
Quality of Faculty | Staff of an institution winning Nobel Prizes and Fields Medals | Award | 20%
 | Highly cited researchers in 21 broad subject categories | HiCi | 20%
Research Output | Papers published in Nature and Science* | N&S | 20%
 | Papers indexed in Science Citation Index-Expanded and Social Science Citation Index | PUB | 20%
Per Capita Performance | Per capita academic performance of an institution | PCP | 10%
Total | | | 100%

* For institutions specialized in humanities and social sciences, such as the London School of
Economics, N&S is not considered, and the weight of N&S is relocated to other indicators.
Source: http://www.arwu.org/ARWUMethodology2010.jsp#1
The Shanghai index aggregates six different indicators:
Alumni indicates the total number of alumni of an institution winning Nobel Prizes and Fields
Medals. Different weights are set according to the period in which degrees were obtained, as follows:
100% for alumni obtaining degrees after 1991, 90% for alumni obtaining degrees in 1981-1990,
80% for alumni obtaining degrees in 1971-1980, and so on, down to 10% for alumni obtaining
degrees in 1901-1910.
Award indicates the total number of staff of an institution winning Nobel Prizes in Physics,
Chemistry, Medicine and Economics, or the Fields Medal in Mathematics. Different weights are set
according to the period in which the prizes were won: the weight is 100% for winners after 2001, 90%
for winners in 1991-2000, 80% for winners in 1981-1990, 70% for winners in 1971-1980, and so
on, down to 10% for winners in 1911-1920. If a winner is affiliated with more than one
institution, each institution is assigned the reciprocal of the number of institutions. For Nobel
Prizes, if a prize is shared by more than one person, weights are set for winners according to their
proportion of the prize.
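As a small sketch of how these rules combine (the function names and the worked example are ours, not ARWU's):

```python
# Illustrative sketch of the decade weighting and prize-sharing rules
# for the Award indicator described above.
import math

def decade_weight(year):
    """100% for winners after 2001, 90% for 1991-2000, ... 10% for 1911-1920."""
    if year < 1911 or year > 2010:
        return 0.0
    return math.ceil((year - 1910) / 10) / 10.0

def award_contribution(year, n_institutions=1, prize_share=1.0):
    """Contribution of one laureate to one institution's Award count.

    A laureate affiliated with k institutions contributes 1/k to each;
    a shared Nobel Prize counts in proportion to the winner's share.
    """
    return decade_weight(year) * prize_share / n_institutions

# A laureate winning half of a 1995 Nobel Prize while affiliated with two
# institutions contributes 0.9 * 0.5 / 2 = 0.225 to each institution.
print(award_contribution(1995, n_institutions=2, prize_share=0.5))
```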
HiCi shows the number of highly cited researchers in 21 subject categories2.
N&S indicates the number of papers published in Nature and Science between 2005 and 2009. To
distinguish the order of author affiliation, a weight of 100% is assigned for corresponding author
affiliation, 50% for first author affiliation (second author affiliation if the first author affiliation is
the same as corresponding author affiliation), 25% for the next author affiliation, and 10% for
other author affiliations. Only publications of 'Article' and 'Proceedings Paper' types are
considered.
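The author-position weighting can be captured in a small lookup; the position labels below are ours, chosen only to illustrate the rule:

```python
# Sketch of the author-position weighting for the N&S indicator.
NS_POSITION_WEIGHTS = {
    "corresponding": 1.00,  # corresponding author affiliation
    "first": 0.50,          # first author (or second, if first == corresponding)
    "next": 0.25,           # the next author affiliation
    "other": 0.10,          # all remaining author affiliations
}

def ns_credit(position):
    """Fractional paper credit an institution earns for one Nature/Science paper."""
    return NS_POSITION_WEIGHTS.get(position, 0.0)

# An institution hosting the first (but not corresponding) author earns
# half a paper toward its N&S count:
print(ns_credit("first"))  # 0.5
```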
2 The definition of categories and detailed procedures can be found at the website of Thomson ISI.
PUB designates the total number of papers indexed in the Science Citation Index-Expanded and
the Social Science Citation Index in 2009. Only publications of 'Article' and 'Proceedings Paper'
types are considered. When calculating the total number of papers of an institution, a special
weight of two is introduced for papers indexed in the Social Science Citation Index.
PCP marks the weighted scores of the above five indicators divided by the number of full-time
equivalent academic staff. If the number of academic staff of the institutions of a country cannot be
obtained, the weighted scores of the above five indicators are used3.
3 For ARWU 2010, the numbers of full-time equivalent academic staff were obtained for institutions in the USA, UK,
France, Canada, Japan, Italy, China, Australia, the Netherlands, Sweden, Switzerland, Belgium, South Korea, the
Czech Republic, Slovenia, New Zealand etc.
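In symbols (our notation, not ARWU's), with $w_i$ the Table 1 weights and $s_i$ the scores on the five indicators above, the per-capita indicator reads:

$$\mathrm{PCP} = \frac{\sum_{i=1}^{5} w_i \, s_i}{N_{\mathrm{FTE}}},$$

where $N_{\mathrm{FTE}}$ is the number of full-time equivalent academic staff.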
The “Times” ranking
Times Higher Education first conceived its annual World University Rankings with QS in
2004 by identifying the primary missions of world-class universities (research quality, graduate
employability, teaching quality and international outlook) and then seeking ways of measuring
each of these. In October 2009, QS and THE ended their collaboration, under which THE had been
licensed to publish the QS results known as the "Times Higher Education (THE) - QS World
University Rankings". QS continued to produce its own ranking as the QS World University
Rankings, keeping the methodology of the former THE-QS World University Rankings 2004-
2009.
QS compiles a multi-index ranking, taking into account the following:
Academic Peer review is a method used to assess academic quality; it represents a measure of
the average reputation for research of a given institution among academics in each of five broad
subject areas. An international online survey asks academics to identify, first, up to
30 institutions outside their own country (from a list of institutions in the region(s) they have
selected) and, second, up to 10 from their own country that they consider excellent in each
subject area they have selected4. Since academics are not asked to rank the institutions,
performance is based on the number of occurrences of each institution's name. Responses are
weighted by region5 and compiled into five separate peer reviews, one for each of the five subject
areas, which are combined with equal weighting to yield the final result.
Citations per faculty results from dividing the number of citations (provided by Scopus) by the
number of faculty staff. For the calculation of this indicator, QS gathers two distinct datasets: the
total citation count for the last five years and the "Full Time Equivalent" (FTE) faculty.
Student faculty ratio has been identified to address the stated objective of evaluating teaching
quality. For the calculation of this indicator, "Full Time Equivalent" (FTE) students and "Full
Time Equivalent" (FTE) faculty are taken into account.
The Recruiter review, an annual online survey, roughly resembles the peer review approach. It
covers the same subject areas and is operated in a near-identical way to a single-subject-area
peer review. Again, a regional weighting is applied to ensure equal representation of the three
"super regions": the Americas; Europe, the Middle East & Africa; and Asia Pacific. Graduate
employers come from a very diverse range of businesses in terms of scope, sector, size and
nature.
4 Respondents are asked to identify the subject area(s) with which they are most familiar from the following:
arts and humanities, life sciences and biomedicine, natural sciences, social sciences, and engineering and IT.
5 For each subject area a regional weighting is applied to ensure equal representation of the three regions:
the Americas; Europe, the Middle East & Africa; and Asia Pacific.
International faculty and International students are scores based on the proportions of full-time
registered faculty and students who hold an overseas nationality; they determine the international
outlook, or reputation, of higher education institutions.
Weightings are assigned by indicator as set out in Table 2. Prior to 2007, the scores for each
indicator were scaled against the top performer on that measure: the leading institution was
awarded 100, and subsequent institutions' scores were scaled against that maximum. Since 2007, a
balancing tool has been adopted: z-scores, which involve subtracting the mean score from each
individual score and then dividing by the standard deviation of the scores. The percentile ranks of
the z-scores are then obtained using a standard normal distribution table.
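A minimal sketch of this balancing step, with invented data and the normal CDF standing in for the lookup table:

```python
# Illustrative z-score standardization and percentile ranking, as
# described above; the scores below are made up.
from statistics import mean, stdev
from math import erf, sqrt

def z_scores(values):
    """Subtract the mean from each score and divide by the standard deviation."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def percentile_rank(z):
    """Percentile of a z-score under a standard normal distribution."""
    return 100.0 * 0.5 * (1.0 + erf(z / sqrt(2.0)))

scores = [55.0, 60.0, 70.0, 85.0, 95.0]
for z in z_scores(scores):
    print(round(z, 2), round(percentile_rank(z), 1))
```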
Table 2. QS World University Rankings criteria vs. THE World University Rankings categories

QS World University Rankings
Criterion | Indicator | Weight
Research quality (60%) | Academic Peer review | 40%
 | Citations per faculty | 20%
Teaching quality (20%) | Student faculty ratio | 20%
Graduate employability (10%) | Employer review | 10%
International outlook (10%) | International faculty | 5%
 | International students | 5%
Total | | 100%

THE World University Rankings
Category | Indicator | Weight
Citations — research influence (32.5%) | Citation impact (normalized average citations per paper) | 32.5%
Research — volume, income and reputation (30%) | Reputational survey (research) | 19.5%
 | Research income (scaled) | 5.25%
 | Papers per academic and research staff | 4.5%
 | Public research income / total research income | 0.75%
Teaching — the learning environment (30%) | Reputational survey (teaching) | 15%
 | PhD awards per academic | 6%
 | Undergraduates admitted per academic | 4.5%
 | Income per academic | 2.25%
 | PhD awards / bachelor's awards | 2.25%
Industry income — innovation (2.5%) | Research income from industry (per academic staff) | 2.5%
International mix — staff and students (5%) | Ratio of international to domestic staff | 3%
 | Ratio of international to domestic students | 2%
Total | | 100%
Since Times Higher Education has ended its arrangement with its former data provider
(QS), it no longer uses the old rankings methodology. Instead, THE has announced that it
intends to produce its own rankings, developed in concert with a new rankings data
provider, Thomson Reuters. The 2010-2011 ranking provides a top 200 list and six tables
showing the top 50 institutions by subject; the methodology is based on new criteria and
weightings: 13 separate performance indicators, brought together into five headline categories
(see Table 2), were designed to capture the full range of university activities, from
teaching to research to knowledge transfer:
The "Teaching — the learning environment" category employs five separate indicators designed
to provide a sense of the teaching and learning environment of each institution, from both the
student and the academic perspective: (1) a reputational survey on teaching, a worldwide poll of
experienced scholars which examines the perceived prestige of institutions in both research and
teaching; (2) the number of undergraduates admitted by an institution, scaled against the number
of academic staff; (3) the ratio of PhD to bachelor's degrees awarded by each institution; (4) the
number of PhDs awarded by an institution, scaled against its size as measured by the number of
academic staff; (5) a measure of institutional income scaled against academic staff numbers.
The indicators of the "Research — volume, income and reputation" category are: (6) a
reputational survey on research; (7) the university's research income, scaled against staff numbers
and normalized for purchasing-power parity; (8) a simple measure of research volume scaled
against staff numbers, obtained by counting the number of papers published per staff member in
the academic journals indexed by Thomson Reuters; (9) a measure of public research income
against an institution's total research income.
"Citations — research influence" is measured by (10) the number of times a university's published
work is cited by academics. The data are drawn from the 12,000 academic journals indexed by
Thomson Reuters' Web of Science database. The figures are collected for every university, with
data aggregated over a five-year period from 2004 to 2008. All the citation impact data are
normalized to reflect variations in citation volume between different subject areas.
A single indicator determines the "Industry income — innovation" category: (11) the institution's
research income from industry, scaled against the number of academic staff.
The final category, "International mix — staff and students", employs two indicators: (12)
the ratio of international to domestic staff and (13) the ratio of international to domestic students.
To calculate the overall ranking score, "z-scores" were created for all datasets. This standardizes
the different data types on a common scale and allows fair comparisons between them. Each data
point is given a score based on its distance from the average (mean) of the entire dataset, where
the scale is the standard deviation of the dataset. The z-score is then turned into a "cumulative
probability score" to give the final totals.
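In standard notation (ours, not THE's), each data point $x$ is standardized and then mapped to a cumulative probability through the standard normal distribution function $\Phi$:

$$z = \frac{x - \mu}{\sigma}, \qquad p = \Phi(z) = \frac{1}{2}\left[1 + \operatorname{erf}\!\left(\frac{z}{\sqrt{2}}\right)\right].$$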
Criticism
There are limitations to every ranking. The majority of rankings share some basic
methodological features:
- they compare whole universities (ignoring that these have different goals and missions and that
they are internally differentiated), either exclusively or alongside comparisons of broad
discipline fields;
- they aggregate the indicators6 into a single composite overall indicator of the quality of
an institution; the weights given to the individual indicators, as well as the indicators themselves,
differ considerably between rankings;
- the results are displayed in a league table with individual rank positions from first to last
(Berghoff and Federkeil, 2009: 42). Other limitations regard:
Education versus Research: the focus on competitive research, and much less on teaching and
learning, is well known.
6 Glänzel and Debackere (2009: 66) emphasize that ranking is "positioning comparable objects on an
ordinal scale based on a (non-strict) weak order relation among (statistical) functions of, or a combination of
functions of, measures or scores associated with those objects", and that these functions (usually based on variables
serving an evaluative purpose) are called indicators.
Size – rankings have been criticised for favouring large, well-funded institutions with an
emphasis on science, and most of their indicators are not adjusted for size.
Bias towards the natural and life sciences – measures such as citation counts favour
universities which are strong in the fields of medicine and the natural sciences, where there is a
strong publishing and citation culture.
Anglo-Saxon bias – a common criticism of global rankings is that they favour universities which
publish in English, because most journals counted by bibliometric databases (counts
of papers and citations per university) are in English. In addition, Anglo-Saxon academics have a
greater culture of citing each other's work than academics in other countries.
UBB in international rankings
Since there is no Romanian university in the ARWU top 500, a strategy to enter the top
500 universities in the world through the Shanghai ranking was launched by the Babeş-Bolyai
University in March 2006. Considering the scores obtained by UBB on the Shanghai ranking
indicators, it should be noted that, until the strategy was launched, no employee or graduate of
UBB had received a Nobel Prize; no employee or graduate of UBB had received a Fields Medal;
no employee or graduate of UBB was included in the "highly cited researchers" list; and UBB
staff had only a few ISI contributions (including contributions in Science and Nature). Therefore,
a series of measures was proposed to stimulate international visibility:
- financial rewards to staff in recognition of their publications in ISI journals (rewards for
publications in Science and Nature; rewards for articles published in journals indexed in the ISI
Science Citation Index Expanded, Social Sciences Citation Index and Arts and Humanities
Citation Index; rewards for articles in journals indexed in international databases that perform a
selection process of publications);
- academic promotion conditioned on ISI publications and publication in international databases;
- a set of activities meant to familiarize the UBB staff with the ISI system (workshops, publication
of an "Understanding ISI" volume);
- support for the entry of UBB publications into the ISI system;
- an internal quality indicator which takes into account the weight of ISI publications per faculty,
with a fundamental role in the distribution of basic financing from the budget;
- streamlining research by reducing the bureaucracy associated with the management of projects
funded by grants;
- employing criteria targeting scientific prestige when allocating internal resources for research
and in the internal accreditation of research centers;
- university subscriptions to scientific journal databases, accessible without a password inside
the UBB;
- information systems meant to simplify interdisciplinary collaborations between faculties
(Agachi and Bucur, 2007: 73-79).
The initial strategy was reinforced in 2008 by a package of measures that includes: the adaptation
of scientific research policy to international evaluation criteria; the adaptation of new personnel
policies to international evaluation criteria; the adaptation of the doctoral program to the need to
increase internationally recognized scientific production; the orientation of all scientific research
institutes towards criteria of productivity and international recognition; raising the status of
University publications and publishers; and informing the faculties and departments about the
evolution of the criteria, procedures and results of international academic rankings, while
recommending appropriate actions (Mărcuş et al., 2009: 19-20).
Regarding the position occupied by the Babeş-Bolyai University in the THE-QS World
University Rankings, it should be mentioned that until 2008 UBB was not taken into account. In
2008, the University caught the attention of the THES authors by registering on the website
www.topuniversities.com and creating its own profile. The 2009 THES ranking and the 2010
QS ranking place the Babeş-Bolyai University in the 601+ band.
National Rankings
A first Romanian initiative to compile a ranking of universities according to certain
performance criteria took place in 2000. It proposed a hierarchical system based on seven criteria
obtained by aggregating indicators, each with a specified weight, taking into account issues
related to input, process and output. Since then, several bodies in charge of higher education and
scientific research (CNCSIS7, CNFIS8, Ad Astra) have proposed various methodologies for the
evaluation and/or classification of universities in Romania.
CNFIS and CNCSIS
The National Higher Education Funding Council and the National Council for Scientific
Research in Higher Education are two structures with an advisory role for the Ministry of
Education, Research, Youth and Sport, with responsibilities in distributing funds to the state
universities in Romania. CNFIS proposes and applies the methodology for basic funding (personnel
costs and material costs) and for additional funding (housing and food subventions, endowments,
investments and repairs, scientific research), while CNCSIS is the main institution funding
scientific research and graduate studies. The basic financing of Romanian higher education rests
on a quantitative component (the number of students) and a qualitative component (quality
indicators). Through specific formulas, the two components are expressed as a number of full-time
equivalent students and as relative quality indicators. Their values are calculated separately for
each university, according to the reported information. Quality criteria were introduced into the
financing algorithm in 2000, reaching 30% of the total financing value by 2008 (Mărcuş et al.,
2009: 145). CNFIS differentially distributes budgetary funds to the state universities in Romania
based on quality indicators (see Table 3) and on the institutional performance reported in the
previous year. For 2008, 13 indicators grouped into four groups were used, while for 2010 CNFIS
proposed a set of 16 indicators.
7 Consiliul Naţional al Cercetării Ştiinţifice din Învăţământul Superior, in translation the National Council
for Scientific Research in Higher Education.
8 Consiliul Naţional pentru Finanţarea Învăţământului Superior, in translation the National Higher Education
Funding Council.
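Purely as an illustration of how such a two-component allocation might be structured (the actual CNFIS formulas are more elaborate; all names and figures below are invented):

```python
# Hypothetical sketch of a basic-financing formula combining a
# quantitative component (FTE students) with a qualitative component
# (aggregated quality indicators, worth 30% of the total by 2008).

def basic_funding(total_budget, fte_students, quality_scores, quality_share=0.30):
    """Split a budget between universities.

    fte_students:   {university: full-time equivalent students}
    quality_scores: {university: aggregated relative quality indicator}
    """
    total_fte = sum(fte_students.values())
    total_quality = sum(quality_scores.values())
    allocation = {}
    for uni in fte_students:
        quantitative = (1 - quality_share) * total_budget * fte_students[uni] / total_fte
        qualitative = quality_share * total_budget * quality_scores[uni] / total_quality
        allocation[uni] = quantitative + qualitative
    return allocation

# Two invented universities sharing an invented budget:
print(basic_funding(1_000_000, {"U1": 20_000, "U2": 10_000}, {"U1": 1.2, "U2": 0.9}))
```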
Table 3. Set of quality indicators for 2010 (CNFIS proposals, December 2009)

Group | Sub-group | Qualitative indicator | 2010 | 2009 | 2008
I. Teaching staff (8.5%) | I.A. Quality of the teaching staff (4%) | QI1 Ratio of teaching staff to full-time equivalent students (budget and tax) | 3.00% | 3.00% | 4.00%
 | | QI2 Ratio of professors to full-time equivalent students (budget and tax) | 1.00% | 1.00% | 0.00%
 | I.B. Potential development of the teaching staff (4.5%) | QI3 Ratio of associate professors to full-time equivalent students (budget and tax) | 1.00% | 1.00% | 1.00%
 | | QI4 Ratio of teaching staff with a doctoral degree to full-time equivalent students (budget and tax) | 1.50% | 1.50% | 1.50%
 | | QI5 Ratio of teaching staff under 35 years to full-time equivalent students (budget and tax) | 2.00% | 2.00% | 2.00%
II. The impact of scientific research on the didactic process (9%) | II.A. Performance level in scientific research (7%) | QI6 Performance level in scientific research (index with complex structure) | 7.00% | 7.00% | 7.00%
 | II.B. Means of disseminating the scientific research results (2%) | QI7 Percentage of master and doctorate full-time equivalent students (for 2009 and further, studies in liquidation) within the total number of FTE students | 1.00% | 1.00% | 0.00%
 | | QI8 Ratio (percentage) of research and design contracts to the total university income | 1.00% | 1.00% | 1.00%
III. Infrastructure (3.5%) | III.A. Quality of the infrastructure (2.5%) | QI9 Costs for equipment and investments relative to the number of students (budget and tax), except for students enrolled in distance education | 1.50% | 1.50% | 1.50%
 | | QI10 Material expenses relative to the number of students (budget and tax), except for students enrolled in distance education | 1.00% | 1.00% | 1.00%
 | III.B. Quality of the means of documentation (1%) | QI11 Costs for the purchase of books and journals relative to the number of students (budget and tax), except for students enrolled in distance education | 1.00% | 1.00% | 1.00%
IV. University management (9%) | IV.A. Quality of academic, administrative and financial management (7%) | QI12 Weight of investment expenditure in the university budget allocations for this purpose | 0.50% | 0.50% | 0.00%
 | | QI13 Total quality of academic and administrative management (index with complex structure) | 3.00% | 3.00% | 3.00%
 | | QI14 Weight of income from sources other than budgetary allocations in the total university income | 2.00% | 2.00% | 2.00%
 | | QI15 Weight of income from sources other than budgetary allocations spent on institutional development in the total university income | 1.50% | 1.50% | 3.00%
 | IV.B. Quality of student social and administrative services (2%) | QI16 Quality of student social and administrative services (index with complex structure) | 2.00% | 2.00% | 2.00%
V. Lifelong learning (in 2009: 0%) | | QI17 Development of lifelong learning in universities | 0.00% | 0.00% | 0.00%
Total | | | 30.00% | 30.00% | 30.00%
QI6 has a special status in the financing methodology, as an indicator with a complex
structure. For the calculation of QI6, universities report their data to CNCSIS, not to CNFIS (as
happens for the other dimensions). QI6 for 2010 is obtained by aggregating five general principles,
each with a specific weight, regarding the assessment of the quality of scientific research activities
in universities:
1. The ability to attract funds for scientific research
2. The ability to prepare highly qualified human resources for scientific research
3. The relevance and visibility of the results of the research activity
4. The ability of academics to design and develop products and innovative technologies for
business
5. The institutional capacity of universities to conduct and support scientific research
performance
CNFIS and CNCSIS do not publish annual hierarchies, as the majority of ranking
systems do. However, in 2008, CNFIS released a report entitled Analysis of the evolution of the
quality indicator QI6 on the performance of universities in scientific research and its influence
on the distribution of budgetary allocations for basic funding. This report analyzes the evolution
of QI6 and its sub-indicators in the period 2003-2007 for Romanian state higher education
institutions. Five value groups were established, plus two additional ones for art and architecture
universities. Over the five years, UBB was always in the first group. For its part, CNCSIS
initiated a study entitled Ranking of Romanian universities in terms of scientific research activity.
According to the rankings established on the basis of the eight sub-indicators of QI6 and using
data reported annually by the universities in 2004-2006, the Babeş-Bolyai University ranked
second in 2004 and 2005 and climbed to first place in 2006.
Ad Astra Ranking
The Ad Astra association is a non-profit organization founded in 2002 to promote
Romanian scientific research; it includes Romanian researchers working in the country and
abroad. Since 2005, the association has published a ranking of Romanian universities based on
the articles published by academics of Romanian higher education institutions in internationally
recognized scientific journals indexed by the ISI Web of Science. The third and most recent
edition is the 2007 edition, which includes an overall ranking and rankings by scientific field that
reflect the scientific performance of academics. All the universities in Romania, both state and
private, were taken into account. The overall ranking was based on relating the total number of
articles to the number of teaching staff. The ranking by scientific area was produced by
distributing the scientific articles published by universities across scientific fields, according to
the journal in which they were published, and by adding up the impact factors of the journals in
which the universities published articles. A journal's impact factor is an approximate measure of
the prestige and quality of a scientific journal. The distribution of journals by scientific area was
also made by ISI. All the articles indexed in 2006 written by authors from Romania were
extracted from the ISI database and then distributed by institution, based on the addresses given
by the authors in the articles. An article by authors from several institutions was counted as a
whole article for each institution. Only publications of the 'article' type, which present original
scientific results, were considered; other publication types, such as letters and reviews, were
excluded. Some journals are considered by ISI as belonging to several fields; an article published
in such journals was counted as a whole article for each of the journal's areas. Some areas were
grouped so that the list of fields used in the tables is close to the areas in which Romanian
undergraduate studies can be organized according to current legislation (Ad Astra, 2008,
Romanian universities' ranking, pp. 12-13). The main limitation of this ranking is that it only
evaluates the scientific performance of the teaching staff and no other factors indicating the
quality of education.
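A minimal sketch of these counting rules (the data structures and names below are ours, with invented figures):

```python
# Illustrative sketch of the Ad Astra counting rules described above.

def adastra_overall(articles, teaching_staff):
    """Overall score: total ISI articles per member of teaching staff.

    articles: list of sets, one per article, each holding the institutions
    appearing in the article's author addresses; an article counts as one
    whole article for every institution involved.
    """
    counts = {}
    for institutions in articles:
        for inst in institutions:
            counts[inst] = counts.get(inst, 0) + 1
    return {inst: counts.get(inst, 0) / staff
            for inst, staff in teaching_staff.items()}

# Two articles, one of them co-authored across both institutions:
articles = [{"UBB"}, {"UBB", "UB"}]
print(adastra_overall(articles, {"UBB": 100, "UB": 150}))
```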
In the overall ranking, the Babeş-Bolyai University occupies the second position.
In the ranking by scientific area, the Babeş-Bolyai University occupies 1st place in
psychology, theology and mechanical engineering; 2nd place in physics, chemistry, computer
science, biology, geology, economics and business, sociology, political sciences and journalism,
and materials engineering; 3rd place in environmental science, nuclear technology, biomedical
engineering and multidisciplinary engineering; 4th place in chemical engineering; and 5th place in
medicine and pharmacy.
What rankings should…
Given the fact that the rankings of HEIs have become not only a way of legitimizing
institutional value at the global level, but also a part of the framework of national accountability
and quality assurance processes, the International Ranking Expert Group (founded in 2004 by the
UNESCO European Centre for Higher Education in Bucharest and the Institute for Higher
Education Policy in Washington, D.C.) proposed a set of principles of quality and good practice
in HEI rankings: the Berlin Principles on Ranking of Higher Education Institutions. These
were grouped into four categories, as follows:
A) purposes and goals of rankings
- be one of a number of diverse approaches to the assessment of higher education inputs,
processes, and outputs
- be clear about their purpose and their target groups
- recognize the diversity of institutions and take the different missions and goals of
institutions into account
- provide clarity about the range of information sources for rankings and the messages each
source generates
- specify the linguistic, cultural, economic, and historical contexts of the educational
systems being ranked
B) design and weighting of indicators
- be transparent regarding the methodology used for creating the rankings
- choose indicators according to their relevance and validity
- measure outcomes in preference to inputs whenever possible
- make the weights assigned to different indicators (if used) prominent and limit changes to
them
C) collection and processing of data
- pay attention to ethical standards and the good practice recommendations articulated in
these Principles.
- use audited and verifiable data whenever possible
- include data that are collected with proper procedures for scientific data collection
- apply measures of quality assurance to ranking processes themselves
- apply organizational measures that enhance the credibility of rankings
D) presentation of ranking results
- provide consumers with a clear understanding of all of the factors used to develop a
ranking, and offer them a choice in how rankings are displayed.
- be compiled in a way that eliminates or reduces errors in original data, and be organized
and published in a way that errors and faults can be corrected (Berlin Principles on
Ranking of Higher Education Institutions, 2006).
Conclusions
Higher education institutions all over the world manifest a "fatal attraction" to academic
rankings. In our opinion, there is a fundamental gap between the HEIs that occupy a position in
rankings as a result of policies established independently of the "ranking phenomenon" and the
HEIs that make a goal of this, adjusting their institutional strategies in order to take part in the
"reputational race" or "competition for recognition". However, the need to adapt to international
standards is not to blame; it is a must. It remains clear that in the globalization era, "think global
and act local" has to become "think global and act global".
References
Agachi, Paul Şerban and Bucur, Ioan (eds.) (2007). Politica cercetării ştiinţifice la
Universitatea Babeş-Bolyai. Cluj-Napoca: Presa Universitară Clujeană.
Berghoff, Sonja and Federkeil, Gero (2009). The CHE approach. In C.Dehon, D. Jacobs
and C. Vermandele (eds.) Ranking universities. Bruxelles: Editions de L’Université de Bruxelles,
pp. 41-63.
Berlin Principles (2006). Berlin Principles on Ranking of Higher Education Institutions.
http://www.ihep.org and http://www.cepes.ro.
Buela-Casal, Gualberto, Gutiérrez-Martínez, Olga, Bermúdez-Sánchez, María Paz,
Vadillo-Muñoz, Oscar (2007). Comparative study of international academic rankings of
universities. Scientometrics, 71(3): 349-365.
De Maret, Pierre (2007). Universities in the World: What for?. In J.Sadlak and N.C. Liu
(eds.) The World-Class University and Ranking: Aiming Beyond Status. Cluj-Napoca: Cluj
University Press, pp. 31-38.
Glänzel, Wolfgang and Debackere, Koenraad (2009). On the "multi-dimensionality" of
ranking and the role of bibliometrics in university assessment. In C.Dehon, D. Jacobs and C.
Vermandele (eds.) Ranking universities. Bruxelles: Editions de L’Université de Bruxelles, pp.
65-75.
Marginson, Simon and Van der Wende, Marijk (2007). To Rank or To Be Ranked: The
Impact of Global Rankings in Higher Education. Journal of Studies in International Education
11: 306-329.
Mărcuş, Andrei, Gherghin, Gelu, Szabó, Melinda and Zaharie, Monica (2009). Ghidul
competitivităţii şi calităţii. Cluj-Napoca: Presa Universitară Clujeană.
Merisotis, Jamie P. (2002) Summary report of the invitational roundtable on statistical
indicators for the quality assessment of higher/tertiary education institutions: Ranking and league
table methodologies, Higher Education in Europe, 27: 475–480.
O’Leary, John, Quacquarelli, Nunzio and Ince, Martin (2010). Top Universities Guide
2010. London: QS Quacquarelli Symonds Limited.
Pathak, Virendra and Pathak, Kavita (2010). Reconfiguring the higher education value
chain. Management in Education 24: 166-171.
Proulx, Roland (2007). Criteria for Ranking Universities with Affiliated Components. In
J.Sadlak and N.C. Liu (eds.) The World-Class University and Ranking: Aiming Beyond Status.
Cluj-Napoca: Cluj University Press, pp.167-174.
Slaughter, Sheila and Leslie, Larry L. (1997). Academic Capitalism: Politics, Policies,
and the Entrepreneurial University. Baltimore: The Johns Hopkins University Press.
Van Raan, Anthony F.J. (2007). Challenges in the Ranking of Universities. In J.Sadlak
and N.C. Liu (eds.) The World-Class University and Ranking: Aiming Beyond Status. Cluj-
Napoca: Cluj University Press, pp. 87-121.
Vincke, Philippe (2009). University rankings. In C.Dehon, D. Jacobs and C. Vermandele
(eds.) Ranking universities. Bruxelles: Editions de L’Université de Bruxelles, pp. 11-25.
Zhao, Chun-Mei (2007). Building World-Class Universities: Some Unintended Impacts of
University Ranking. In J.Sadlak and N.C. Liu (eds.) The World-Class University and Ranking:
Aiming Beyond Status. Cluj-Napoca: Cluj University Press, pp. 321-331.
Websites
http://www.topuniversities.com/university-rankings/world-university-rankings/home,
16/Aug/2010
http://www.timeshighereducation.co.uk/world-university-rankings/, 18/Aug/2010
http://www.arwu.org/ARWU2010.jsp, 25/Aug/2010
http://www.cncsis.ro/Public/cat/22/Indicatorul-IC6.html, 15/Sep/2010
http://www.cnfis.ro/index_f.html, 15/Sep/2010
http://www.ad-astra.ro/universitati/clasamentul_universitatilor_2007.pdf, 17/Sep/2010