Dr Amberyn Thomas, University of Queensland

TRANSCRIPT
Things that can be counted:
◦ number of publications
◦ citation counts
Also:
◦ journal quality measures – e.g. Journal Impact Factor
◦ h-index
Metrics must always be considered in context – there are significant disciplinary differences, and differences over time.
qual·i·ty /ˈkwälətē/ Noun
• the standard of something as measured against other things of a similar kind
• the degree of excellence of something
Definition of research used in the United Kingdom Research Assessment Exercise. Research is defined as that which: ‘… includes work of direct relevance to the needs of commerce, industry, and to the public and voluntary sectors; scholarship; the invention and generation of ideas, images, performances, artefacts including design, where these lead to new or substantially improved insights; and the use of existing knowledge in experimental development to produce new or substantially improved materials, devices, products and processes, including design and construction. It excludes routine testing and routine analysis of materials, components and processes such as for the maintenance of national standards, as distinct from the development of new analytical techniques. It also excludes the development of teaching materials that do not embody original research.’ Here the term ‘scholarship’ has the particular meaning: ‘... the creation, development and maintenance of the intellectual infrastructure of subjects and disciplines, in forms such as dictionaries, scholarly editions, catalogues and contributions to major research databases.’
Determined by peers – peer review
AND/OR
Inferred by “numbers” – bibliometric indicators
◦ citation counts etc.

An aside… what do citation counts measure?
As a research quality or performance measure the assumption is: high citations = high quality. BUT:
“…not only the content of scientific work, but also other, in part non-scientific, factors play a role in citing behaviour. Citations can therefore be viewed as a complex, multidimensional…phenomenon”
Lutz Bornmann & Hans-Dieter Daniel (2008), “What do citation counts measure? A review of studies on citing behavior”, Journal of Documentation, Vol. 64, Iss. 1, pp. 45–80
Thomson Reuters Web of Knowledge
◦ Web of Science – citation reports, h-index, graphs and tables of publications per year and citations per paper (cpp) per year, refine and analyse feature
◦ Journal Citation Reports
◦ Essential Science Indicators
◦ ResearcherID – allows authors to claim publications as theirs (assert authorship), citation analysis feature, collaboration metrics
Scopus
◦ Citation counts from 1996, citation overview, h-index (post-1996), SCImago Journal Rank (SJR), SNIP etc.
Google Scholar
◦ Often criticised for poor quality control, but better coverage in some disciplines
◦ Publish or Perish (PoP) – a GS interface
HERDC, ERA data etc. Value added using:
◦ APIs for retrieving Scopus and WoS citation counts
◦ WoS Web services – weekly download of pubs with UQ in address field
◦ ResearcherID integration, etc.
Why use publication metrics?
• Track record of individuals, groups of researchers
• Contribution to the discipline and/or significance of the contribution
• Evidence of international profile
• Evidence of capacity to collaborate effectively
2. The current environment…
• research evaluation is now part of the landscape (ERA) – grant application reviewers, potential employers etc. may expect to see metrics
• JIF – “measures” how often articles in journals are cited.
• In a given year, the impact factor of a journal is the average number of citations received per paper published in that journal during the two preceding years.
• A = no. of times articles published in 2008 and 2009 were cited during 2010
• B = no. of “citable items” published by that journal in 2008 and 2009
• 2010 impact factor = A/B
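The A/B calculation above is simple enough to sketch in a few lines of Python; the citation and item counts below are hypothetical, purely for illustration:

```python
def impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
    """Two-year journal impact factor: A / B.

    A = citations received this year by items published in the two
        preceding years
    B = "citable items" published in those two preceding years
    """
    return citations_to_prior_two_years / citable_items_prior_two_years

# Hypothetical journal: 450 citations in 2010 to items published in
# 2008-2009, and 180 citable items published in 2008-2009.
print(impact_factor(450, 180))  # 2.5
```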
Journal Impact Factor
• JIF “measures” how often articles in journals are cited. E.g. the 2008 IF is the average number of citations received in 2008 by papers published in 2006 and 2007.
• Flawed for a number of reasons… but it may be a useful tool to profile research output.
• See also the SCImago Journal Rank Indicator.
Because citation rates vary between disciplines, it is important to provide context:
• Not an article-level metric.
• Highly discipline-specific.
• Methodology flawed/skewed.
Number of publications
Career citation count
Citations per paper
% cited (or % uncited)
h-index

These numbers tell us something… but there’s a lot missing…
Number of publications
◦ Profile by document type, subject area, year, journal etc.
Number of citations (career citation count)
◦ Provide an indication of coverage by citation data provider
◦ Are all outputs included or just articles and reviews?
Citations per paper
◦ I have 35 refereed journal articles, of which 33 are indexed by Web of Science. These articles have received 230 citations, giving an average citation per (indexed) paper of 7 (source: WoS, 05/02/11).
Percentage not cited
◦ Of my 33 indexed journal articles, only 3 articles have not been cited by others (9% not cited), and these were all published in 2010 (WoS, 05/02/11).
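Both of these statements are simple arithmetic over the same two inputs. Using the figures from the worked example (33 indexed articles, 230 citations, 3 uncited), a quick Python sketch:

```python
# Figures from the worked example above.
indexed_papers = 33
total_citations = 230
uncited_papers = 3

# Average citations per (indexed) paper.
citations_per_paper = total_citations / indexed_papers

# Share of indexed papers never cited.
percent_uncited = 100 * uncited_papers / indexed_papers

print(round(citations_per_paper))  # 7
print(round(percent_uncited))      # 9  (i.e. "9% not cited")
```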
Citations for an individual paper or sub-set of publications: supporting evidence for an outstanding contribution in a particular area.
Citations per paper by year of publication and by journal/subject area/FOR code
◦ Can then benchmark, e.g. with InCites, or use TR Essential Science Indicators for benchmarking
◦ Or create your own benchmarks using the refine feature within WoS
qual·i·ty /ˈkwälətē/ Noun
• the standard of something as measured against other things of a similar kind
• the degree of excellence of something
Give the numbers some meaning!
◦ I have 35 refereed journal articles, of which 33 are indexed by Web of Science. These articles have received 230 citations, giving an average citation per (indexed) paper of 7 (source: WoS, 05/02/11).
◦ I have 15 articles which exceed the expected citation rate for the ESI Field of <ESI field> for their respective publication years, and 5 articles in the top 10% by citations for this Field (source: Essential Science Indicators, 05/02/11).
The h-index (J. E. Hirsch) was proposed in 2005 as a measure of the research influence of a scientist:
“The index h, defined as the number of papers with citation number greater than or equal to h, is a useful index to characterize the scientific output of a researcher.”
E.g. if your h-index is 15, you have 15 papers cited 15 times or more.
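Hirsch's definition translates directly into code: sort citation counts in descending order and find the largest rank at which the count still meets or exceeds the rank. A minimal sketch (the citation counts in the example are hypothetical):

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # this paper still "counts" toward h
        else:
            break     # every later paper has even fewer citations
    return h

# A researcher with 15 papers cited 20 times each (plus a few
# low-cited papers) has h = 15: 15 papers cited 15+ times.
print(h_index([20] * 15 + [3, 1, 0]))  # 15
```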
Productivity + Impact = Influence
Citation reports in Web of Science and Citation Tracker in Scopus calculate the h-index of a search result.
◦ You can search for the publications of an author, research group, or institution and calculate the h-index.
◦ You must be aware of the difficulties of comprehensively searching for the publications of an individual or group.
◦ No database lists all articles.
Use the h-index measure with care. Citation patterns vary across disciplines.
◦ E.g. h-indices in Medicine are much higher than in Mathematics.
◦ Researchers in different disciplines cannot be compared using the h-index.
◦ Even within the same discipline, the h-index should not be used alone as a measure of research quality.
◦ Researchers in the same discipline at different stages of their careers cannot be compared using the h-index.
Cites per paper:

Paper      Researcher 1   Researcher 2   Researcher 3
Paper 1    100            50             15
Paper 2    90             50             15
Paper 3    70             45             14
Paper 4    50             45             14
Paper 5    35             5              12
Paper 6    4              4              4
Paper 7    0              2              4
Paper 8    0              0              4
Paper 9    0              0              4
Paper 10   0              0              4
Average    35.3           20.1           9.0
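Applying Hirsch's definition to the citation counts in the table above illustrates how different metrics tell different stories: despite very different citation averages, all three researchers come out with the same h-index of 5. A sketch:

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    return max([0] + [rank for rank, c in enumerate(ranked, start=1) if c >= rank])

# Per-paper citation counts from the table above.
researcher_1 = [100, 90, 70, 50, 35, 4, 0, 0, 0, 0]
researcher_2 = [50, 50, 45, 45, 5, 4, 2, 0, 0, 0]
researcher_3 = [15, 15, 14, 14, 12, 4, 4, 4, 4, 4]

for r in (researcher_1, researcher_2, researcher_3):
    print(h_index(r))  # 5 for every researcher
```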
My h-index based on these indexed papers is 10. I have 4 papers (A, B, C, D) with more than 20 citations and 1 paper (E) with 29 citations (source: WoS, 05/02/11). I also have an additional 3 papers not indexed by WoS, with 29 citations based on Scopus data (05/02/11).
Take care how the metrics are presented:
◦ be specific and descriptive about what data you present – check what it is you are presenting
◦ provide dates and data source(s)
◦ keep screen dumps for reference
Context is all-important:
◦ use a range of metrics as appropriate to your discipline and your publication history
◦ define your discipline and use appropriate benchmarks