bibliometrics, webometrics, altmetrics, alternative metrics
TRANSCRIPT
dans.knaw.nl – DANS is an institute of KNAW and NWO
Bibliometrics, Webometrics, Altmetrics, Alternative metrics
A possible Zeno effect for science metrics – and why do we nevertheless look for metrics?
Andrea Scharnhorst – www.knowescape.org
Workshop “Alternative metrics or tailored metrics: Science dynamics for science policy”, November 9–10, 2016, Warsaw
EASY: https://easy.dans.knaw.nl/ui/home
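Note added for clarity (not on the original slide): the “Zeno effect” in the title is presumably borrowed from the quantum Zeno effect, where sufficiently frequent measurement of a system suppresses its own evolution. A minimal sketch of the standard textbook formula, assuming short-time quadratic decay with a characteristic (Zeno) time \tau_Z and N equally spaced measurements within time t:

    P_N(t) \approx \left[\, 1 - \left(\frac{t}{N\,\tau_Z}\right)^{2} \right]^{N} \;\longrightarrow\; 1 \quad \text{as } N \to \infty

Read as an analogy only: the more often science is measured, the more the act of measurement itself may freeze the very dynamics it is meant to observe – the tension the talk's question points at.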
Motivation
PhD on mathematical models of science dynamics – measurement – scientometrics (e.g., # researchers in a field; # PhD students in a field)
Use of metrics in science policy – Eastern Europe in the mirror of bibliometrics – Matthew effect of countries (Bonitz)
New practices, new metrics
Web indicators for scientific, technological and innovation research – WISER, 2002–05
Academic Careers Understood through Measurement and Norms – ACUMEN, 2011–14
Impact-EV – Evaluation of SSH, 2013–17
Visualisation of structure and evolution of science
Visualising NARCIS
Mapping Digital Humanities
Digital Observatory for DH (Pilot)
Semantic web technologies – Open Data: CEDAR Dutch Historic Census
New practices: Research Data – FAIR
Growth of science and indicator systems – how did metrics come about?
Timeline, 1950–2010:
NSF (1950) – https://nsf.gov/statistics/ – e.g., PhDs per field
OECD (1961) – Frascati Manual (1963)
euroCRIS (2002) – CERIF standard data model
VINITI (1952) – RZh (Referativnyi Zhurnal)
ISI (1960) – Web of Knowledge (WoK), citation indexing
Altmetrics.com (2011)
VIVO – open-source software/ontology for scholarship
Wikipedia
Google Scholar (2004)
CASRAI (2006) – open standards for research information (RI), CA
Box model of research
Output: journal articles; citation impact; patents
Input: human capital – authors; …? students?
Input: expenditures – projects; …? infrastructures?
Process
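Purely as an illustration (not on the original slide), the box model can be read as a minimal data structure. This is a sketch under assumptions: the class and field names below are invented for readability, and the “process” inside the box is deliberately left out, just as the slide leaves it opaque.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ResearchUnit:
        """Minimal sketch of the box model: inputs -> (opaque) process -> outputs."""
        # Input side: human capital and expenditures
        authors: List[str] = field(default_factory=list)
        students: List[str] = field(default_factory=list)
        project_funding_eur: float = 0.0
        infrastructures: List[str] = field(default_factory=list)
        # Output side: the usual countable products
        journal_articles: int = 0
        citations: int = 0
        patents: int = 0

The point of the sketch is what it omits: the process by which inputs become outputs is exactly what indicator systems built on this model do not capture.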
Tailored metrics or all-in metrics?
“Perhaps counter-intuitively, when it comes to metrics more is not necessarily always better. When deciding what to record, you should picture yourself at operationally significant periods within the year, like year-end, budget submission time, and month end, imagining the information you would ideally like to report upwards or use to make operational decisions for your department. For example, a handy technique is to design your ideal annual departmental report and then work backwards, asking whether at present you have the necessary data to produce the report. The annual report should talk to your firm’s strategic goals if it is to be effective and well received. Of course you won’t collect metrics solely for upward reporting to management; you’ll also collect metrics to help run your department better. Differentiate between external and internal metrics – those meant to help you and your team run things better, and those meant to communicate your value externally within the firm.”
Peter Borchers, Managing Director – http://priorysolutions.com/articles/law-firm-library-metrics-aall-session-summary/
Metrics - What for?
Questions
To better understand science dynamics
To better monitor science dynamics
How have disciplines developed over centuries?
Do innovation, institutionalisation and education operate on different time scales?
What is the dynamic of the academic job market?
How many ‘small fields’ does a university need?
How adequate are national portfolios for team science?
What is the impact of large-scale infrastructure investment?
Who re-uses research data?
Blind spots – infrastructure and new fields
(Timeline figure, 1999–2016)
ExPoSe
From Digitization to Digital Humanities
Get inspiration
Evidence Analytics & Information Systems
But be aware
Local (geo, topic, institutional) science measurement
Global, cross-domain, long-term
Research Information Systems
Not all measurement should be pursued at all levels of granularity and at all times! Up-scaling comes with a price!
Take away
Understanding – Monitoring
Combine qualitative and quantitative research
Make sure to refer to standard data models – re-use ontologies
RI data are ‘just’ data – use the FAIR principles (Findable, Accessible, Interoperable, Re-usable); see the sketch after this list for a minimal illustration
When experimenting with new Research Information Systems, communicate where they are located (local–global; incidental–long-term; …)
Communicate about error margins, uncertainty and ambiguity – visualise!
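To make the FAIR bullet above slightly more concrete, here is a minimal, illustrative metadata record for a research-information dataset. All field names, the test DOI, the example URL and the CERIF reference are assumptions chosen for illustration, not a prescribed schema.

    # Illustrative only: the field choices echo the FAIR principles; they are not an official standard.
    record = {
        "identifier": "doi:10.5072/example-rio-2016",        # Findable: persistent identifier (test DOI prefix)
        "title": "Departmental publication and funding counts, 2010-2016",
        "creators": ["Example Research Office"],              # hypothetical creator
        "access_url": "https://example.org/data/rio-2016",    # Accessible: resolvable via an open protocol
        "format": "text/csv",                                 # Interoperable: open, non-proprietary format
        "data_model": "CERIF",                                # Interoperable: re-used standard data model (cf. euroCRIS)
        "license": "CC-BY-4.0",                               # Re-usable: explicit licence
        "provenance": "Extracted from the local CRIS, 2016-11-01",  # Re-usable: provenance statement
    }

The sketch also ties two take-aways together: referring to a standard data model such as CERIF is what makes a local record interoperable beyond its own system.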
References
Godin, B. (2005). Measurement and statistics on science and technology: 1920 to the present. London: Routledge.
Godin, B. (2001). The Emergence of Science and Technology Indicators: Why Did Governments Supplement Statistics With Indicators? (No. 8). Montreal. Retrieved from http://www.csiic.ca/PDF/Godin_8.pdf (annex: NSF indicators (scores/feasibility), considered but not recommended)
Priem, J., Taraborelli, D., Groth, P., & Neylon, C. (2010). Altmetrics: A manifesto. Retrieved from http://altmetrics.org/manifesto/
Hicks, D., & Wouters, P. (2015). The Leiden Manifesto for research metrics: Use these ten principles to guide research evaluation. Nature, 520(7548), 429–431. doi:10.1038/520429a
Borgman, C. L. (2015). Big data, little data, no data: Scholarship in the networked world. Cambridge, Mass: MIT Press
Börner, K. (2010). Atlas of science: Visualizing what we know. Cambridge, Mass: MIT Press.
Börner, K. (2015). Atlas of knowledge: Anyone can map. Cambridge, Mass: MIT Press.
Wilkinson, M. D., Dumontier, M., Aalbersberg, I. J., Appleton, G., Axton, M., Baak, A., ... Mons, B. (2016). The FAIR Guiding Principles for scientific data management and stewardship. Scientific Data, 3, 160018. doi:10.1038/sdata.2016.18
Thanks for your attention!
[email protected] | @ScharnhorstA | @knowescape | dans.knaw.nl