
  • Structural biology infrastructure for researchers

    ESFRI Workshop – Monitoring of RIs, Periodic Update of Landmarks, Use of KPIs

    19-20 November 2018, Milan
    Ondřej Hradil, CEITEC, Instruct Centre, Czech Republic

    European Research Infrastructure – Instruct is a Landmark project in the European Strategy Forum for Research Infrastructures

  • Internal and external evaluations - reflection

    • Instruct-ERIC = distributed infrastructure, specific challenges
    • Evaluation at multiple levels (as demanded by stakeholders):
      • International – NO
      • EU/ESFRI – YES
        • Annual report to EC (ERIC Regulation requirement)
        • 5-year review (demanded by the Instruct-ERIC statutes)
        • International Scientific Advisory Board
        • ESFRI Landmarks monitoring – ?
      • National – YES, together with related national-level infrastructures, incl. the Instruct component
      • Institutional – YES, given the distributed nature

  • EU / ESFRI level evaluation

    Instruct-ERIC evaluation:
    • Why: commitment in the Instruct-ERIC statutes
    • Periodicity: every 5 years (2014, 2019)
    • How: external peer review panel reporting to the Instruct Council
    • KPIs:
      • Strong visibility in the community
      • Demonstrable demand for services
      • Access provision to integrated structural biology infrastructure
      • Strong scientific output (publications/dissemination)
      • Key training achievements
      • Engagement with industry and support for innovation
      • Instruct-ERIC evolving to meet future scientific and training needs
      • Delivers a distinctive contribution to the research landscape; value for money
    • Main stakeholders in the evaluation: ERIC member states

  • National level evaluation – example CZ

    Example from the Czech Republic
    • Why: ranking of RIs, basis for funding decisions
    • Periodicity: every 3-4 years (2014, 2017, next indicated for 2021)
    • How: international panel of experts for all RIs on the national roadmap, organised by the ministry
    • KPIs: see next slide
    • What is evaluated: the national-level infrastructure, incl. its involvement in ESFRI
    • Main stakeholders: the government that decides on funding + the ministry

  • Czech evaluation – KPIs

    Obligatory outputs:

    Type of outcome                                                                            | Unit
    1. Publications from the RI's activities created by the RI's users (annually)             | number
    2. Publications from the RI's activities the RI's team members participated in (annually) | number
    3. National / foreign users of the RI                                                     | number / percentage
    4. Master students educated within the RI / subset from abroad                            | number / number
    5. Ph.D. students trained within the RI / subset from abroad                              | number / number
    6. Financial income from national resources (public/private)                              | amount (million CZK/year)
    7. Financial income from foreign resources (public/private)                               | amount (million CZK/year)
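Read as a data-collection requirement, the table above amounts to a small reporting schema. The following is a minimal Python sketch of one annual return, assuming nothing beyond the rows of the table; every field name is hypothetical, since the presentation does not define an official reporting format.

```python
from dataclasses import dataclass

@dataclass
class AnnualObligatoryOutputs:
    """One RI's annual return of the obligatory outputs tabled above.

    Illustrative only: the field names are hypothetical, not the
    official Czech reporting schema.
    """
    year: int
    user_publications: int         # 1. publications created by the RI's users
    team_publications: int         # 2. publications with RI team participation
    national_users: int            # 3. national users (number)
    foreign_users_pct: float       # 3. foreign users (percentage)
    master_students: int           # 4. Master students educated within the RI
    master_students_abroad: int    # 4. subset from abroad
    phd_students: int              # 5. Ph.D. students trained within the RI
    phd_students_abroad: int       # 5. subset from abroad
    income_national_mczk: float    # 6. national income (million CZK/year)
    income_foreign_mczk: float     # 7. foreign income (million CZK/year)
```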

  • National level evaluation – example UK

    Example from the United Kingdom
    • Why: ranking of scientific projects, basis for funding decisions
    • Periodicity: 5 years (mid-term review by funders/research councils)
    • How: internal panel review by MRC with international expert panel members
    • KPIs: scientific output; training/skills development; scientific impact – unique services; technical development impact
    • What is evaluated: national infrastructure provision through the RI; excellence vs competitors
    • Main stakeholders: government/research council

  • Other measures of success requested by Instruct members (national review)

    • What is Instruct's visibility in the member country community? (hard to measure)
    • Is access enabling science across the EU partners that would not otherwise occur? (hard to measure)
    • Has Instruct initiated new research strategies to tackle 'big challenges'?
    • Has Instruct positively influenced infrastructure developments in member states – would these have developed without Instruct? (hard to measure)
    • Has Instruct engaged with the technology industry, supporting innovation?
    • Does Instruct deliver an important and distinctive role in the RI landscape?
    • Does Instruct have an investment model that offers impact and value for money?
    • Has Instruct stimulated new investment?
    • Has Instruct harmonized standards with other RIs?

  • Institutional level evaluation

    Example from CEITEC, Instruct-ERIC centre in the Czech Republic
    • Why: improvement of performance, quality management
    • Periodicity: annually since 2015
    • How: panel of 4 internal + 1 external member
    • KPIs: usage, impact (publications, patents, …), budgets, …
    • What is evaluated: individual core facilities (part of the Instruct-ERIC centre)
    • Main stakeholders: CEITEC management

  • CEITEC areas of assessment

    KPI / area of assessment:
    1. Staffing
    2. Expenses and budget
    3. Access/Utilization
    4. Education and training
    5. Expertise diversity
    6. Quality assessment
    7. Communication and Promotion
    8. Impact/Outcomes – publications, patents, other results
    9. Other: technical benchmarking, financial benchmarking, etc.

  • KPI: Publications arising from access provision

    Possible metric: % of papers acknowledging Instruct in the relevant field

    Journal                 | Impact factor | No. Instruct publications 2015-2018
    Nature                  | 41.577        | 6
    Science                 | 37.205        | 1
    Nature Chemistry        | 25.870        | 1
    Nature Methods          | 25.062        | 1
    Nature Immunology       | 21.506        | 1
    Nature Protocols        | 15.269        | 1
    Nature Chemical Biology | 15.066        | 1
    Molecular Cell          | 14.248        | 1
    Nature Communications   | 12.353        | 14
    Science Advances        | 11.51         | 1
    EMBO Journal            | 9.792         | 1

    The table shows the highest-impact journals publishing Instruct-acknowledged papers, as a measure of scientific excellence.

    Acknowledged publications: 2015: 53; 2016: 62; 2017: 63; 2018 (to 21/8/18): 41.
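To make the proposed metric concrete, here is a minimal sketch of how the "% of papers acknowledging Instruct in the relevant field" could be computed; the paper corpus and the acknowledgement test are assumptions, as the presentation does not specify how either would be obtained.

```python
def acknowledgement_share(papers, acknowledges_instruct):
    """Proposed KPI: % of papers in the relevant field acknowledging Instruct.

    papers: collection of paper records for the field (source unspecified)
    acknowledges_instruct: predicate testing one paper's acknowledgements
    """
    papers = list(papers)
    if not papers:
        return 0.0
    hits = sum(1 for paper in papers if acknowledges_instruct(paper))
    return 100.0 * hits / len(papers)

# Sanity check of the yearly totals quoted above (2018 counted to 21/8/18):
yearly = {2015: 53, 2016: 62, 2017: 63, 2018: 41}
assert sum(yearly.values()) == 219   # Instruct-acknowledged papers, 2015-2018
```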

  • Instruct: Measured outcomes and impact - ACCESS

    Access to infrastructure, 1/7/15 to 30/6/18: 442 proposals received, 16% (roughly 71) rejected on scientific review.

    Access requests to Instruct centres, by country:
    FR 35%, UK 21%, IL 11%, NL 7%, ES 7%, IT 6%, BE 6%, CZ 4%, DE 3%, FI 0%

    Top 10 access platform types requested:
    EM, crystallisation, protein production, biophysics, membrane, NMR, imaging, nanobodies, X-ray and SAXS, MS

  • Instruct: Measured outcomes and impact - Awards

    Training courses

    % of participants, by country:
    FR 19%, UK 18%, DE 14%, ES 11%, USA 10%, IL 5%, NL 4%, PT 3%, GR 3%, CH 2%, rest 11%

    Courses hosted, by country:
    FR 25%, PT 15%, ES 11%, UK 9%, IL 10%, IT 8%, CZ 8%, NL 6%, DE 4%, GR 2%, BE 2%

  • Instruct: Measured outcomes and impact - Awards

    Internships – number of interns hosted, by country:
    UK 5, FR 2, NL 2, BE 1, ES 1, IL 1, IT 1

    R&D Awards (success rates worked through in the sketch below):
    • 2015: 39 applications received; 7 awards made; total amount awarded €87,000
    • 2016: 42 applications received; 9 awards made; total amount awarded €90,000
    • 2017: 60 applications received; 7 awards made; total amount awarded €90,000
    • In 2016 and 2017, the review panel recommended 29 proposals (rejected for an R&D award) which were approved for access based on the science proposed.
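The award figures above imply yearly success rates and average award sizes; a short sketch of that arithmetic, with the data transcribed from the slide:

```python
# R&D Award statistics transcribed from the slide above:
# (year, applications, awards made, total awarded in EUR)
rd_awards = [
    (2015, 39, 7, 87_000),
    (2016, 42, 9, 90_000),
    (2017, 60, 7, 90_000),
]

for year, applications, awards, total_eur in rd_awards:
    success_rate = 100 * awards / applications
    average_award = total_eur / awards
    print(f"{year}: {success_rate:.0f}% success rate, "
          f"average award ~EUR {average_award:,.0f}")

# Output:
# 2015: 18% success rate, average award ~EUR 12,429
# 2016: 21% success rate, average award ~EUR 10,000
# 2017: 12% success rate, average award ~EUR 12,857
```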

  • KPIs vs. areas of assessment – internal monitoring

    KPIs
    • Only a limited number!
    • Clear quantification
    • Shall we set targets up front?

    Areas of assessment
    • Wider scope and contextual picture
    • Space for qualitative assessment

  • Data collection for KPIs

    • Needs to be established at the beginning
    • Centralised metrics collected by the Instruct Hub – number of proposals, training outputs, publications acknowledging Instruct-ERIC, …
    • Otherwise rely on input from the distributed Instruct centres (see the sketch below)
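To illustrate the split the slide describes between centrally collected metrics and returns from distributed centres, here is a minimal aggregation sketch. The data model is an assumption for illustration only: the hub figures are taken from earlier slides, but the centre names, counts, and field names are hypothetical, not the Instruct Hub's actual schema.

```python
# Metrics collected centrally by the Instruct Hub (figures from earlier slides).
hub_metrics = {
    "proposals_received": 442,         # access proposals, 1/7/15 to 30/6/18
    "acknowledged_publications": 219,  # Instruct-acknowledged papers, 2015-2018
}

# Returns reported by distributed centres (hypothetical names and counts).
centre_returns = [
    {"centre": "Centre A", "training_events": 8, "interns_hosted": 2},
    {"centre": "Centre B", "training_events": 5, "interns_hosted": 1},
]

def aggregate(hub, centres):
    """Combine hub-level totals with sums over the centre-level returns."""
    totals = dict(hub)
    for key in ("training_events", "interns_hosted"):
        totals[key] = sum(centre[key] for centre in centres)
    return totals

print(aggregate(hub_metrics, centre_returns))
# {'proposals_received': 442, 'acknowledged_publications': 219,
#  'training_events': 13, 'interns_hosted': 3}
```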

  • Expectations from ESFRI monitoring/periodic update

    • No additional bureaucracy
    • Reflect the areas of assessment for the ESFRI roadmap (science + implementation)
    • Take account of different RI funding cycles/operational timeframes
    • Relevance of KPIs and output measures to different RI types and objectives: single-sited / distributed RI; basic / applied science
    • Include scientific impact as a key KPI
    • National and institutional evaluations taken into account
    • ESFRI review of RIs should be accepted as valid by national stakeholders

  • Suggestions for ESFRI monitoring/periodic update

    • Include ESFRI WG members in the Instruct-ERIC evaluation
    • Develop a single template evaluation that will satisfy all/many member state requirements
    • The evaluation should not aim to compare RIs – they are not comparable
    • Maintain scientific excellence as the overarching quality criterion
    • Consultation on methodology at an advanced stage (with stakeholders such as ESFRI)

  • www.structuralbiology.eu @instructhub

    Thank you


  • Central aspects for success of RIs in the long run – useful lessons

    • Sustainable finances:
      • 10 years into operational RIs on the ESFRI Roadmap, many different financial models are in existence
      • Core funding from members sustains Hub activities (governance, organisation, legal, etc.) but not usually access
      • Consider some form of core funding to underwrite sustainability for infrastructures:
        • Could be based on inclusion on the ESFRI Roadmap
        • Funds ringfenced in the Horizon Europe budget
    • Finalise and implement an EC/ESFRI Long-Term Sustainability action plan
    • Political stability – BREXIT