

REF 2014, HE Pedagogic Research and Impact
Final report - October 2015

[Pedagogic research] ‘was valued highly in our submission (about half higher education, half other phases) perhaps because I have worked in all phases and value it all and see much theoretical work as generic although contexts produce different environments in which to test out and experience theory in practice. But actually I know other departments could have submitted HE pedagogy but didn’t value it, both in our uni and elsewhere …’

‘Despite our hard work of our team on pedagogic research and publications over many years, the Uni's approach was pretty much to dismiss it. I worked long and hard with the Education professors building the narrative for that UoA but couldn't get our publications taken seriously. Only one pedagogic researcher from a faculty did get into the Education UoA.’

‘I just don’t know about impact I feel very ambivalent about it because I do appreciate all the arguments about why it’s important but I think once you start measuring something you start defining it in ways that the rigidity is not always helpful.’

‘And I think one of the issues with impact, I remember going back a bit now, reading through this stuff, the Guidance on impact in the REF, and it was saying if it impacts just within HE then we're not particularly interested in that, we're looking at impacts outside HE, and of course that's always tricky isn't it for pedagogic research to demonstrate?’

Professor Pauline Kneale
Professor Debby Cotton
Dr Wendy Miller


Acknowledgements

This project was funded by the HE Academy (ASC-2053) to explore the effect of the Research Excellence Framework impact rules on the submission of higher education pedagogic research to the Education Unit of Assessment (UoA 25).

The research was undertaken by the authors, through the Pedagogic Research Institute and Observatory (PedRIO) at Plymouth University. The research involved desk-based analysis and interviews with 15 HE colleagues at 13 HE Institutions, with various roles and perspectives in relation to the REF and pedagogic research.

The research team is very grateful to all colleagues who gave their time to explain their engagement with the REF processes.

Abbreviations

ECU Equality Challenge Unit
FTE Full time equivalent
GoS Guidance on Submissions
HE Higher Education
HEFCE Higher Education Funding Council for England
HEI Higher Education Institution
NSS National Student Survey
NUS National Union of Students
OERs Open Educational Resources
OFFA Office for Fair Access
PedRIO Pedagogic Research Institute and Observatory, Plymouth University
QAA Quality Assurance Agency
QR Quality-related research
RAE Research Assessment Exercise
RCUK Research Councils UK
REF Research Excellence Framework
SP Sub-panel
SRHE Society for Research into Higher Education
UoA Unit of Assessment
UC Unclassified
UUK Universities UK


Contents

Acknowledgements.......................................................................................................................................3

Abbreviations................................................................................................................................................3

Tables............................................................................................................................................................5

Executive Summary.......................................................................................................................................6

1 Introduction and research approach..........................................................................................................9

1.1 Context: the REF processes...............................................................................................................11

1.2 UoA 25 Education and Panel C..........................................................................................................12

1.3 Assessing Research Impact................................................................................................................13

2 Findings....................................................................................................................................................15

2.1 Main characteristics of UoA25 submissions.......................................................................................15

2.2 Selection Strategies...........................................................................................................................16

2.3 Submission of HE-related outputs.....................................................................................................17

2.4 Development of Impact case studies.................................................................................................19

2.5 Type of impact claimed......................................................................................................................22

2.6 Evaluating Impact..............................................................................................................................23

2.7 The future..........................................................................................................................................25

3 Discussion................................................................................................................................................26

4. Recommendations...................................................................................................................................31

5 References................................................................................................................................................33

Appendix 1. Interview schedule..................................................................................................................36

Appendix 2. Profiles for UoA25 ordered by 4*...........................................................................................38

Appendix 3. Profile details..........................................................................................................................40

Appendix 4. Example impact case studies from higher education...........................................................47

Durham University: Changing educational practice through ‘Threshold Concepts’.................................47

University of Edinburgh: Enhancing learning, teaching and assessment at university............................49

Open University: The impact of the National Student Survey: changing the behaviour of institutions, teachers and students.............................................................................................................................51

Plymouth University: Selection of doctors to specialty training on the basis of aptitude........................52

University of Sheffield: Developing Higher Education in Further Education colleges.............................53


Tables

Table 1. Interviewee profiles.......................................................................................................10

Table 2. Overall quality profile: Definitions of starred levels ......................................11

Table 3. Sub-panels within Main Panel C ...................................................................................12

Table 4. Comparison of Education overall profile with other subject groupings.........................13

Table 5. Average of UoA25 overall profiles by HEI characteristics...............................................15

Table 6. Example impact areas ....................................................................................................22

Table 7. Impact case studies which were not submitted to REF 2014..........................................28


Executive Summary

This project explores issues surrounding the submission of, and value given to, higher education (HE) pedagogic research within the 2014 Research Excellence Framework. It explores, with those involved in the submission to the Education Unit of Assessment (UoA) and other stakeholders, the following research questions:

• What proportion of the impact case studies submitted through UoA 25 in the 2014 REF were targeted at each sector?

• How does the proportion of case studies in HE relate to the proportion of outputs?

• In what ways and to what extent did REF guidance about impact case studies affect submissions?

• In what ways and to what extent do stakeholders believe that the REF impacts on HE research?

The impact case study rules specified the number of case studies required according to the number of Category A staff submitted (FTE), and that impacts needed to be wider in reach than the submitting higher education institution. Concerns have been voiced that this may have skewed Education UoA submissions away from HE pedagogy.

The research included an exploration of outputs submitted to the Education UoA in REF2014 and analysis of the impact case studies submitted. This was followed by fifteen interviews with UoA co-ordinators and other stakeholders with an interest in pedagogic research.

The desk-based study of the UoA 25 submission to REF2014 indicates that:

Of the 76 HEIs that submitted to UoA25 (Education), the most successful submissions came from Russell Group universities, those with a member of staff on the REF assessment panel, and pre-1992 universities;

In relation to other UoAs in Panel C, the Education submission overall had a higher than average proportion of 4* outputs and impacts, but also a higher than average proportion of 1* and 2* submissions;

An estimate of the proportion of HE-related outputs in the whole UoA25 submission gives a minimum level of 9% of total submissions (far lower than for other education sectors, e.g. primary and secondary);

HE outputs were published in a total of 122 journals, with 50% of these published in ten journals;

Of the 106 named research groups in the Education UoA, only five (fewer than 5%) explicitly include higher education or HE in their title;

In terms of impact case studies primarily focusing on HE, the proportion of the whole sample is estimated at 8% (17 of 216 impact case studies1 on the REF database).

1 A total of 218 impact case studies were submitted to UoA25. However, two of these were marked as ‘not for publication’ and therefore do not appear in the submissions published on the REF website, and are not included here.

The raw data indicate that HE research formed a relatively low proportion of both the submitted research and the impact case studies in the Education UoA.

There are several competing explanations for why there was a low proportion of HE pedagogic research submitted to this UoA. It may be that such research is limited in quantity, or that it rarely meets the quality threshold for REF submission. Interviews with UoA co-ordinators revealed that some felt that quality was an issue, with pedagogic research often being small-scale and localised. However, other respondents believed that pedagogic research was consistently under-valued by those putting together such submissions, and in some cases by university management. Six concerns were raised by staff engaged in pedagogic research, as well as by some of the UoA co-ordinators. These focused on the following points:

• Credibility problems for pedagogic research with university or faculty colleagues – the idea that HE pedagogic research was ‘not REFable’ remained widespread, and may have influenced submissions;

• The potential tension between research which will have impact on practice and high quality academic outputs – as in other vocational disciplines, there was a tension between the need to share innovations and evaluation with other practitioners, and producing highly academic research outputs;

• Local political issues around entering individuals from outside the School of Education. Most submissions were co-ordinated from within Schools of Education, but much pedagogic research happens outside of them, leading some respondents to feel that they were either overlooked or deliberately ignored;

• Contractual issues with some pedagogic researchers not on academic contracts – a number of respondents reported that staff on teaching-only or non-academic contracts had REFable pedagogic research outputs which were not included because the HEI was unwilling to change the contract to enable inclusion, reinforcing perceived research/teaching divides;

• The need to keep the submission ‘simple’ and ‘safe’ – the need to develop a coherent environment statement may have led to the exclusion of some pedagogic research outputs which were felt to fall outside of the traditional education research areas of schooling or teacher education;

• The inhibiting effect of the HEFCE definition of eligible impacts – most respondents felt that the exclusion of impacts on HE students in the submitting institution was unfair, and that this rule may have disadvantaged pedagogic researchers, since the target of such research is often the potentially very large number of students in their own institution.

Overall, the findings suggest that concerns about pedagogic research and the REF continue to be raised, and that the addition of the ‘impact’ element has not alleviated these. With the sector moving towards greater use of teaching-only contracts, these issues may become worse before they get better.

Recommendations arising from this research are made at four levels:

Individual pedagogic researchers are advised to collaborate with education researchers both within their own institution and in other UK and international institutions

HEIs are advised to remain open to moving staff between different contract types to enable their REF submission to contain high quality pedagogic research

The HEA can continue to support pedagogic research activity in HE through events and networks, and can feed into consultations on the Teaching Excellence Framework regarding a metric for the resourcing of pedagogic research

HE policy makers more broadly could strengthen efforts and support for the UoA25 panel to ensure representation of HE pedagogic research.


1 Introduction and research approach

Higher Education (HE) in the UK – and internationally – has come under increasing pressure to deliver research and teaching which provide value for money to the taxpayer. In the research realm, there is mounting interest in assessing the public benefits of research undertaken in universities, and this has become, in a period of fiscal austerity, a major source of political debate. Research funding in HE in the UK is managed through the ‘dual support’ system. Funding for research amounting to around £3bn is distributed via the Research Councils (RCUK), through grants for specific projects and programmes. In addition, funding is disbursed directly to institutions as a result of research evaluation exercises conducted at around 6 year intervals.

The Research Excellence Framework (REF 2014) was the most recent evaluation point: a process designed to assess the quality of research in UK HE which built on previous rounds of research assessment in the UK (e.g. the Research Assessment Exercise – RAE – in 2008). The results of these assessments determine the allocation of ‘Quality-Related’ (QR) funding (worth £1.6bn in 2015/16) to UK HE institutions. In total, 154 higher education institutions took part in REF 2014, with 1,911 submissions across all units of assessment (UoAs), comprising 52,061 FTE academic staff, 191,150 research outputs and 6,975 impact case studies. The number of institutions submitting to each UoA ranged from 14 (Civil and Construction Engineering) to 101 (Business and Management Studies). A total of 76 institutions submitted to UoA25, Education, making this a fairly competitive UoA.

This project investigated the role of HE pedagogic research within submissions to the Education Unit of Assessment (UoA25) in REF2014. It explores, with those involved in the submissions to this UoA, and other stakeholders, the following research questions:

• What proportion of the impact case studies submitted through UoA 25 in the 2014 REF were targeted at each sector?

• How does the proportion of case studies in HE relate to the proportion of outputs?

• In what ways and to what extent did REF guidance about impact case studies affect submissions?

• In what ways and to what extent do stakeholders believe that the REF impacts on HE pedagogic research?

In particular, we were seeking to investigate individuals’ experiences of undertaking HE pedagogic research and of having such research assessed for REF2014, and to learn about the experience of developing impact case studies for the REF. We were especially interested in the number of HE pedagogy publications, where publication takes place, and the effect of the rules for eligible impact case studies in UoA25, which embraces education at all stages from early years to lifelong learning. Notably, these rules specified (i) the number of case studies required according to the number of Category A staff submitted (FTE), and (ii) that impacts needed to be wider in reach than the submitting higher education institution (HEI). Concerns have been voiced that this may have skewed UoA25 submissions away from HE pedagogy.


The study spanned five months with a work programme of desk-based research, interrogating the REF submissions and impacts databases2, and a literature search which encompassed a range of sources from peer-reviewed journals to media reports on the REF process. This was supplemented by fifteen phone and face-to-face interviews with participants from 13 HEIs (see Table 1 below). This was an exploratory study, gathering qualitative data through a series of in-depth semi-structured interviews, using an interpretivist approach. The aim was to illustrate key issues from an insider’s perspective and, for this reason, purposive sampling was utilised, focusing initially on UoA co-ordinators for Education submissions, identified through contact with UK School of Education research leads. These were followed by further interviews with selected stakeholders with an interest and expertise in pedagogic research. (See Appendix 1 for interview schedules.) The latter group were identified primarily through networks including: the Association of National Teaching Fellows; the Heads of Educational Development Group (HEDG); the Higher Education Academy (HEA) pedagogic research network; and the Staff and Educational Development Association (SEDA). This group were able to provide valuable insights into the opportunities and challenges of undertaking pedagogic research in the context of the REF. We also interviewed one university impact co-ordinator, one institutional REF co-ordinator and one REF assessor to gather some wider perspectives on REF and impact. Based on a small sample, this study does not make claims to generalisability, but rather illustrates some of the key issues which may nonetheless resonate with others working in this field. Further research would be of great interest in terms of exploring, with a wider group, the provisional themes identified here.

Table 1. Interviewee profiles

UoA25 submitted HE Pedagogic Research | Gender | HEI type (Old, New)3 | REF Role
NA | F | N | Institutional Impact Officer
Yes | F | N | Pedagogic researcher, HEDG member
No | F | N | Pedagogic researcher, HEDG member, NTF
No | F | N | Pedagogic researcher, NTF
Yes | F | N | UoA coordinator
Yes | F | N | UoA coordinator
No | F | N | UoA coordinator
Yes | F | N | UoA coordinator
No | F | O | Pedagogic researcher
Yes | F | O | UoA coordinator
Yes | M | N | Institutional REF coordinator
NA | M | O | External Consultant / REF assessor
No | M | O | UoA coordinator
No | M | O | UoA coordinator
No | M | O | UoA coordinator

Transcripts were coded thematically according to the research questions and issues identified from the literature. Direct quotes used in this report are anonymised to preserve the confidentiality of the participants.

2 http://ref.ac.uk/ and http://impact.ref.ac.uk/CaseStudies/
3 Old university = pre-1992; New university = post-1992


1.1 Context: the REF processes

The first Research Assessment Exercise (RAE) was undertaken in 1989, with others thereafter in 1992, 1996, 2001 and 2008 (McGettigan 2013). The process and the systems for allocation of research funding are constantly evolving, and REF 2014 was no exception. Key processes followed closely those of the RAE in 2008, albeit with minor differences (including the process for submission of early career researchers and those who submitted fewer outputs on equality grounds). Participating HEIs submitted research outputs (publications, portfolios, patents, etc.) alongside supporting statements about the research environment for peer review. For the first time, statements detailing the impact of the HEI’s research were required through an ‘impact template’ and accompanying case studies. All submissions were then assessed in subject groupings, by four main panels (A-D), with sub-panels for each specific subject, or Unit of Assessment. The submissions were read and graded by panel members and assessors on a scale of 4* to unclassified (UC) to give an overall profile for each UoA (Table 2).

Table 2 - Overall quality profile: Definitions of starred levels

Four star Quality that is world-leading in terms of originality, significance and rigour.

Three star Quality that is internationally excellent in terms of originality, significance and rigour but which falls short of the highest standards of excellence.

Two star Quality that is recognised internationally in terms of originality, significance and rigour.

One star Quality that is recognised nationally in terms of originality, significance and rigour.

Unclassified (UC) Quality that falls below the standard of nationally recognised work, or work which does not meet the published definition of research for the purposes of this assessment.

The components making up the overall profile were: (i) outputs which accounted for 65% and were assessed on criteria of ‘originality, significance and rigour’; (ii) impact, which accounted for 20% and was assessed on ‘reach’ and ‘significance’; and (iii) environment which accounted for 15% and was assessed on ‘vitality and sustainability’ (REF 2012). Academics on the sub-panel were responsible for assessing the outputs to the REF, whilst research users and academics together assessed the impact statements and case studies (REF 2014).
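To illustrate the arithmetic (a minimal sketch only; the sub-profile figures below are hypothetical and not drawn from any actual submission), the overall profile at each starred level is the weighted sum of the three component sub-profiles:

# Hypothetical sub-profiles for one submission: percentage of each component
# judged to be at 4*, 3*, 2*, 1* and Unclassified.
outputs     = {"4*": 30, "3*": 40, "2*": 25, "1*": 5, "UC": 0}
impact      = {"4*": 40, "3*": 40, "2*": 15, "1*": 5, "UC": 0}
environment = {"4*": 50, "3*": 35, "2*": 15, "1*": 0, "UC": 0}

# REF 2014 weightings: outputs 65%, impact 20%, environment 15%.
overall = {
    level: round(0.65 * outputs[level] + 0.20 * impact[level] + 0.15 * environment[level], 1)
    for level in outputs
}

print(overall)
# e.g. 4* = 0.65*30 + 0.20*40 + 0.15*50 = 35.0, and similarly for each starred level.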

REF 2014 was a multi-institution effort, involving the UK research councils, participating HEIs and research user organisations from across the UK. Widespread academic and popular debate surrounds both the purposes and processes of such research evaluation exercises (Gilroy 2009, Burrows 2012, Dean et al. 2013, Rebora and Turri 2013, Khazragui and Hudson 2015, Jump 2015b). According to Sayer (2014) the REF is not fit for purpose for the following reasons: it costs too much; it is not effectively peer reviewed (compared to journal standards); it undermines collegiality; it discourages innovation; and it is redundant, given that the Russell Group and former 1994 Group universities were allocated almost 85% of QR funding.


1.2 UoA 25 Education and Panel C

The sub-panel (SP) for Education came within the remit of Main Panel C. The submissions for RAE2008 and REF2014 are shown in Table 3. The number of individuals submitted to the 2014 Education UoA dropped by nearly 15% from RAE 2008, with 76 HEIs submitting compared to 81 in the previous iteration (a decrease of 6%). The UoA25 sub-panel report states that, “Institutions appear to have been strategic in their entry policies, and comparison with HESA data suggests a relatively high degree of selectivity in many submissions” (REF 2014, p103). The report also notes that 47 submissions were counted as ‘small’ (fewer than 15 FTE), more than twice the average for other sub-panels – which is probably to be expected, reflecting the vocational nature of Schools of Education provision.

Table 3: Sub-panels within Main Panel C (REF 2014)

UoA | Year | No. of submissions | Cat A FTE | % change in Cat A FTE | No. outputs | Impact case studies
Total Panel C submissions | 2014 | 614 | 14,413 | -2.8 | 52,212 | 2,040
Total Panel C submissions | 2008 | 604 | 14,834 | | 58,494 |
16 Architecture, Built Environment and Planning | 2014 | 45 | 1025 | -0.9 | 3,781 | 146
16 Architecture, Built Environment and Planning | 2008 | 61 | 1034 | | 4,361 |
17 Geography, Environmental Studies and Archaeology | 2014 | 74 | 1686 | +3.4 | 6,012 | 239
17 Geography, Environmental Studies and Archaeology | 2008 | 75 | 1630 | | 6,729 |
18 Economics and Econometrics | 2014 | 28 | 756 | -9.8 | 2,600 | 101
18 Economics and Econometrics | 2008 | 35 | 838 | | 3,037 |
19 Business and Management Studies | 2014 | 101 | 3320 | -5.1 | 12,204 | 432
19 Business and Management Studies | 2008 | 104 | 3497 | | 13,159 |
20 Law | 2014 | 67 | 1553 | -7.1 | 5,525 | 225
20 Law | 2008 | 67 | 1671 | | 6,262 |
21 Politics and International Studies | 2014 | 56 | 1275 | +0.5 | 4,367 | 181
21 Politics and International Studies | 2008 | 59 | 1269 | | 4,714 |
22 Social Work and Social Policy | 2014 | 62 | 1302 | +4.7 | 4,784 | 190
22 Social Work and Social Policy | 2008 | 68 | 1243 | | 5,271 |
23 Sociology | 2014 | 29 | 704 | -24.1 | 2,630 | 100
23 Sociology | 2008 | 39 | 927 | | 3,729 |
24 Anthropology and Development Studies | 2014 | 25 | 562 | +6.4 | 2,015 | 80
24 Anthropology and Development Studies | 2008 | 29 | 528 | | 2,069 |
25 Education | 2014 | 76 | 1442 | -15.0 | 5,526 | 218
25 Education | 2008 | 82 | 1696 | | 7,146 |
26 Sport and Exercise Sciences, Leisure and Tourism | 2014 | 51 | 790 | +58.0 | 2,759 | 128
26 Sport and Exercise Sciences, Leisure and Tourism | 2008 | 39 | 500 | | 2,015 |

(The % change in Cat A FTE and the impact case study counts apply to 2014 only; impact case studies were not part of RAE 2008.)

In terms of outcomes, Table 4 illustrates that the average overall profile for the Education UoA compares well with other disciplines. Education had a relatively large return from 76 institutions, with the percentage of 4* papers and impact statements above average for Panel C. However, the Education submissions also contained a higher than average proportion of 1* and 2* profiles and impact case studies, suggesting a wide spread of quality.


Table 4: Comparison of Education overall profile with other UoAs (average percentages)

Unit of Assessment | Overall profile (4* / 3* / 2* / 1* / UC) | Impact (4* / 3* / 2* / 1* / UC)
16 Architecture, Built Environment and Planning | 29 / 40 / 25 / 6 / 0 | 38.4 / 42 / 15.3 / 3.6 / 0.7
17 Geography, Environmental Studies and Archaeology | 27 / 42 / 26 / 5 / 0 | 34.3 / 42.2 / 19.3 / 3.9 / 0.3
18 Economics and Econometrics | 30 / 48 / 19 / 2 / 1 | 36.3 / 44.7 / 14.1 / 3.4 / 1.5
19 Business and Management Studies | 26 / 43 / 26 / 4 / 1 | 37.7 / 42.5 / 17 / 2.2 / 0.6
20 Law | 27 / 46 / 23 / 4 / 0 | 38.3 / 41.1 / 17.7 / 2.4 / 0.5
21 Politics and International Studies | 28 / 40 / 26 / 6 / 0 | 40 / 44.2 / 13.1 / 2.7 / 0
22 Social Work and Social Policy | 27 / 42 / 25 / 5 / 1 | 43.8 / 36 / 14.9 / 4.1 / 1.2
23 Sociology | 27 / 45 / 26 / 2 / 0 | 43.2 / 39.4 / 13.6 / 3.3 / 0.5
24 Anthropology and Development Studies | 27 / 42 / 26 / 4 / 1 | 40.8 / 43.2 / 11.3 / 3.8 / 0.9
25 Education | 30 / 36 / 26 / 7 / 1 | 42.9 / 33.6 / 16.7 / 6 / 0.8
26 Sports and Exercise Sciences, Leisure and Tourism | 25 / 41 / 27 / 6 / 1 | 39.2 / 32.4 / 21.8 / 6.3 / 0.3
Average of Panel C UoAs | 28 / 42 / 25 / 5 / 1 | 39.5 / 40.1 / 15.9 / 3.8 / 0.7

1.3 Assessing Research Impact

There have been ongoing debates about assessing research impact, for reasons of funding, accountability and to gauge the broader benefit to societies (Donovan 2011, Penfield et al. 2014). As with other processes of research evaluation, although there is broad agreement on the need for assessing impact, many people voice concerns both about the purpose and the means, especially as a determinant of funding streams (Watermeyer 2014, Jump 2015b, McNay 2015). As Smith et al. (2014, p1369) note, “there are three main lines of controversy: the threats to academic autonomy implied in the definition of expert review and the delimitation of reviewers, the scope for boundary-work in the construction of impact narratives and case studies, and the framing of knowledge translation by the stipulation that impact ‘builds on’ research.” However, assessing impact is becoming part of the UK and international research landscape, exemplified by the UK Research Councils’ ‘Pathways to Impact’ toolkit resources to help funding applicants and assessors identify likely outcomes of financial allocations (available at http://www.rcuk.ac.uk/innovation/impacts/). It is clear that impact is multi-faceted, can be reached through different means, and that many challenges remain in determining causal route(s) from research to impact (Phillips 2012).

It is not known whether the requirements and rules for impact case studies in REF 2014 had a contributing role in HEI strategies that resulted in a lower number and size of submissions to Education UoA25. What is known, however, is that – across the board – the inclusion of impact case studies involved many hours of additional work, often with the involvement of central research support staff or external consultants (Nicholson 2015, Technopolis 2015). Analysis has also shown that there were more submissions than would be statistically expected of a size just below the threshold for an additional impact case study (Horne, reported in Jump 2015a).

For REF 2014, examples of the many ways of achieving and evidencing impact were given in the Guidance on Submissions (GoS), with emphasis on the impact beyond academia (REF 2011, p26). The GoS detailed how the reach and significance of an impact would be assessed:

“For the purposes of the REF, impact is defined as an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia.”

Academic impacts were considered to be already accounted for in the outputs and environment elements of REF, and thus for the purposes of the impact element, paragraph 143 stated that:

“Impacts on research or the advancement of academic knowledge within the higher education sector (whether in the UK or internationally) are excluded.”

And of particular significance to UoA25:

“Impacts on students, teaching or other activities within the submitting HEI are excluded.”

Other impacts within the higher education sector, including on teaching or students, are included where they extend significantly beyond the submitting HEI.

The GoS further stated that the focus of assessment was to be the impact of the submitted unit’s research rather than the impact of individuals or individual research outputs (REF 2011). Assigned weightings within the impact component were 20 per cent to the impact template, describing the unit’s approach, and 80 per cent to the case studies. The processes of generating impact case studies and impact templates within Education UoAs, as within others, meant that subject sub-fields had to be identified and selected, with narratives constructed and evidence documented.

Previous research focusing on the preparation of impact case studies across all disciplines concluded that many impacts of research were not included in REF submissions:

‘The 6,975 impact case studies submitted to REF 2014 are not fully representative of the range of impact occurring across the entire HEI sector. The case studies prepared for the REF had to follow a specific format and set of rules. This is likely to have resulted in many types of impact not being able to be submitted. However, it is worth noting in this context that the REF is not intended to be an inclusive exercise and the publications submitted do not aim to represent all research conducted across the sector.’ (Manville et al., 2015, p. 36)

Given the longstanding concerns around pedagogic research and the RAE/REF (Yorke, 2000; Cotton & Kneale, 2014), as well as on impact more widely, this project aimed to investigate the processes involved in developing REF 2014 UoA25 submissions, and consider whether HE pedagogic research was under-represented.


2 Findings

In this section, we explore the publicly-available REF datasets and draw out the prevailing views of our interview participants, whilst remaining open to the likelihood of alternative perspectives, and being aware of the limitations of the sample.

2.1 Main characteristics of UoA25 submissions

Nearly 50% (76 out of the total of 154) of HEIs submitting in the REF included a submission for UoA25 Education. The profiles of these are shown in Appendix 2. Analysis of these profiles was carried out to explore suggestions in the literature (Sayer 2014, Frankl et al. 2014, Jump 2015b) that the REF confirms the existence of ‘the golden triangle’ of excellence in UK research; the results are shown in Table 5.

Table 5. Average UoA25 overall profiles by HEI characteristics

Institution characteristic | FTE Category A staff submitted | 4* | 3* | 2* | 1* | UC
Russell Group average | 36.7 | 41.9 | 38.1 | 17.6 | 2.1 | 0.4
Panel member representation average | 24.6 | 35.8 | 41.5 | 19.0 | 3.5 | 0.3
UoA25 average | 18.9 | 19.9 | 36.3 | 31.3 | 11.1 | 1.3
Neither Russell Group nor Panel member | 12.4 | 11.8 | 34.1 | 37.4 | 14.9 | 1.8
Pre-1992 average | 28.2 | 31.9 | 43.1 | 21.6 | 3.1 | 0.3
Post-1992 average | 10.7 | 9.2 | 30.2 | 40.1 | 18.4 | 2.2

It can be seen from Table 5 that universities in the Russell Group, those with members on the panel and pre-1992 universities had both a higher than average size of submission and a higher than average percentage of 4* and 3* profiles in UoA25, as in other disciplinary areas (e.g. Psychology, and Social Work and Social Policy) and as found in previous analyses of the REF and RAE exercises (see e.g. McNay 2003, Sharp and Coleman 2005, McNay 2015). Conversely, HEIs not within the Russell Group and without Panel member representation had a lower than average size of submission and a higher proportion of 2*, 1* or unclassified profiles.
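The kind of comparison shown in Table 5 can be reproduced from the published UoA25 results. The sketch below is illustrative only: it assumes a hypothetical flat file of overall profiles with one row per submitting HEI and columns named 'Group', 'FTE', '4*', '3*', '2*', '1*' and 'UC' (the published REF data use different field names and would need reshaping first).

import pandas as pd

# Hypothetical CSV of UoA25 overall profiles, one row per submitting HEI,
# with a 'Group' column recording the institution characteristic of interest
# (e.g. Russell Group, panel member representation, pre-1992, post-1992).
profiles = pd.read_csv("uoa25_overall_profiles.csv")

# Average size of submission and average profile at each starred level,
# by institution grouping, mirroring the structure of Table 5.
summary = (
    profiles.groupby("Group")[["FTE", "4*", "3*", "2*", "1*", "UC"]]
    .mean()
    .round(1)
    .sort_values("4*", ascending=False)
)
print(summary)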

McNay (2015) notes that analysis of the contribution of the different elements to overall scores showed that the research-intensive universities generally scored highly on environment and outputs, but that some less research-intensive institutions benefited from the inclusion of impact profiles. It is also possible that in vocational disciplines there is some tension between doing research which is theoretically focused and achieves the highest star-ratings for outputs, and undertaking research which is more practically focused – but is more likely to provide impact. This was specifically referred to by one interviewee with considerable expertise in RAE/REF assessment in education:

“Given this is a practice based discipline there has been a strong argument that practice bases like nursing, social work and education should not be penalised for having practitioners because that is the very nature of the discipline ... in fact in education it’s been reduced audit after audit the actual size going through... the practitioners are writing one star papers that’s exactly what they should be doing showing to other practitioners how they are working and even a two star at an international level is acceptable.”

2.2 Selection Strategies

Each HEI had a different strategy for size of submission, largely based on ‘grade-point average’, or anticipated star-rating. Participants in this research were able to describe the processes involved, although they were not always consistent:

“The general message was to be inclusive but to try and ensure work was gradeable. This was interpreted totally differently by each department. I was happy with this message because I could make everyone feel valued and supported and build a research culture for the future as I knew it had previously been a more select few and people hadn’t feel included.”

As the above statement illustrates, some HEIs took a decision to be more inclusive having experienced the divisiveness of the selection process in RAE 2008. There was a perceived trade-off which participants raised when asked about selection criteria:

“[there is a conflict between] inclusivity and performance / league tables … we carried out rather a lot of mathematical modelling about the likely financial consequences at setting the bar at different levels … the lower you set a bar the better financially at least in the short term you are likely to do ... but reputationally might not look so good and end up having problems attracting top class students and even very strong staff.”

The keystone for selection was, of course, output quality. Outputs that rated highly were those based on a growing volume of large-scale datasets and longitudinal cohort studies. The sub-panel found growing strength in research on HE, with the highest-rated being:

“…characterised by close theoretical engagement, a focus on contemporary social issues, and the ability to engage in comparative and international studies. Reflecting a policy priority and sustained funding, there was particularly strong sociological work on widening participation, using both qualitative and quantitative data. Weaker work tended to be focused on provision or student experience in particular universities and to lack analytical rigour. Higher education research remains an area with great potential.” (REF 2014, p105)

When asked whether HE research had been given equal value to research in other sectors, some UoA co-ordinators were very clear that decisions had been taken solely on the question of merit, or ‘Grade Point Average’. Where this had been challenged, a second or third reviewer had been brought in to adjudicate. However, judging the quality of outputs prior to submission was perceived to be highly problematic. External assessors (some with prior experience of contributing to an RAE panel) were used in several institutions, in an effort to avoid claims of bias from internal reviewers:

“we had two rounds of external actually... we looked at people who had been on the panel for RAE but were not on the sub panel for REF... that didn’t give us many to choose from but then we chose someone who was a sort of generalist I suppose, was known to someone within our unit and someone whose judgement could be relied upon.”

Despite these efforts, external reviewers were not always felt to contribute substantially to the selection process, either because they did not have the specific expertise to assess papers, or simply because they gave very little feedback about the submitted outputs. The possibility that future rounds of research assessment might involve the need to submit all researchers was considered a positive development by one respondent who noted:

“It doesn’t seem to me being any able to achieve anything at all except being very divisive within institutions, within departments and potentially leaving people in my position having to make very difficult decisions about individual members of staff and I think it would be much more straight forward and much fairer if you just got to enter everybody.”

2.3 Submission of HE-related outputs

The 5,519 outputs in the REF UoA25 submissions data were published in 1,025 different journals or other publications. In order to provide a quick assessment of the proportion of HE outputs submitted, we searched for ‘higher education’, ‘HE’ or ‘university’ in the title and volume title fields, resulting in an estimate of 502 publications in this sub-discipline in 122 different journals. This search will have missed some outputs, but gives a minimum figure of 9% of the overall submitted outputs which were related to HE pedagogy.
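A minimal sketch of this kind of keyword screen is shown below. It assumes the outputs data have been exported to a CSV with hypothetical column names 'Title' and 'Volume title'; the actual REF download uses its own file layout and field names, and the exact match rules used in this study may have differed.

import re
import pandas as pd

# Hypothetical CSV export of the UoA25 outputs data.
outputs = pd.read_csv("uoa25_outputs.csv")

# 'higher education' and 'university' matched case-insensitively; 'HE' matched
# case-sensitively as a whole word so the pronoun 'he' is not counted.
loose = re.compile(r"higher education|university", re.IGNORECASE)
strict = re.compile(r"\bHE\b")

def mentions_he(row):
    text = " ".join(str(row.get(col, "")) for col in ("Title", "Volume title"))
    return bool(loose.search(text) or strict.search(text))

he_related = outputs[outputs.apply(mentions_he, axis=1)]
share = 100 * len(he_related) / len(outputs)
print(f"{len(he_related)} of {len(outputs)} outputs ({share:.0f}%) appear HE-related")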

The great majority of HE-related outputs were in sector-specific journals, with the highest number of outputs (64) being published in Studies in Higher Education, which is regarded as the primary place to publish, followed by Higher Education (28), Assessment and Evaluation in Higher Education (25) and Teaching in Higher Education (20). From this analysis, it appears that only seven outputs with higher education or university in the title were published in the British Educational Research Journal, which was the most frequent outlet overall for UoA25 (with 155 outputs). Aside from the British Journal of Sociology of Education, with ten higher education-related outputs, the Journal of Geography in Higher Education and Arts and Humanities in Higher Education were the most frequent discipline-specific HE journals, with three outputs each.

Although not all submissions to Panel C organised their research environment narrative within named Research Groups, another indicator of the extent of HE-related submissions can be found in the names of those which were included in UoA25. Only 5 out of the 106 research groups named (fewer than 5%) explicitly contained ‘higher education’ or ‘HE’ in their name. This is a low number compared to the 41 research centres across the UK that participate in the HEA’s Network for Pedagogic Research and Research into Higher Education. There are political tensions in some institutions including those where HE pedagogy is recognised:

“Well without naming names because it is inappropriate but the XXX group see themselves as effectively leaders in the xxxx area of HE pedagogy, and I think they are without a shadow of a doubt and the feeling would be then should they keep that niche area and promote that and that should be the main thrust. [Others think it should be widened to all sectors], but if we do that we come into competition with the really big Educational researchers you know... so it’s an interesting dynamic”.


“I think there are two components to the education submission, there is quite clearly the higher education side ... and there is the education department. In this University, politics play a big part, only the School of Education staff were returned, which is infuriating.”

There is a strong likelihood that pedagogic research might have been submitted through other sub-panels, especially in cognate areas such as Psychology and Sociology. However, interview respondents were, without exception, unaware of instances where this had occurred within their own institution – several indicating that they did not have time to attain a working knowledge of other areas of submission. Five UoA co-ordinators did note that they had selected pedagogic research undertaken in other disciplines, but that this had not always been easy to identify:

“Yes from education, business, psychology, health and computing – they were all doing interesting and relevant research and began to do seminars for our research centre and joined our events like speed dating etc. This really enriched our research culture but it was purely my trying to contact individuals and mentor them. It had never been done before there, and was pretty demanding!”

The same UoA co-ordinator spoke of the difficulty of identifying “hidden researchers into pedagogy who permeate the system but are not generally valued in their own departments … I took people from health, business, computing, psychology where their departments were not entering pedagogic research even though they could. They were much more exclusive than me.”

Although this respondent claims that other panels were more selective, it may be simply that the research was focused more on education than the home discipline. However, there is some evidence in the report from the Education sub-panel (SP) that some of the submitted research was not of a high enough quality, being “frequently insufficiently theorised to make a contribution to knowledge and/or was low in rigour, with poor use of statistical data or inappropriately selective reporting of qualitative data” (REF 2014, p107).

One UoA coordinator used this as an argument for their decision not to include HE research from across the institution (although they also hint that there may have been pragmatic organisational reasons for such a decision):

“We just entered staff who had good enough outputs simply in the main single submission; I was quite keen to keep everything simple and straight forward as it were ... on the grounds they were just very strong academic contributions to education.”

The issue of keeping the submission ‘simple’ or ‘safe’ came up on a number of occasions, and there is some evidence that UoA co-ordinators found pedagogic research harder to integrate into the narrative they were developing – especially if researchers were isolated and working on topics unrelated to the core of the education submission. In Schools of Education, it may be easier to make school-based or teacher-education research fit with the overall narrative. If, as is often the case, pedagogic researchers are working in relative isolation across a number of different departments, it may have been difficult to weave a coherent story around their diverse outputs.

Other reasons given for not including HE pedagogic research were that time release from other duties, and support or mentoring, were simply not available for researchers working outside schools of education. One participant noted that their submission had been doubled by the inclusion of pedagogic researchers outside of education, and that a tripling in the next submission would be possible if support could be made available. The lack of role models was seen as an important inhibiting factor in the development of pedagogic research outside schools of education. Finally, the issue of contracts was brought up by four of the respondents. Several spoke of the increasing use of ‘teaching-only contracts’ – but noted that these do not always preclude research activity:

“[Some staff are on] contracts that don’t have a requirement for research and very interestingly what’s happening is a lot of those staff are now working for their own doctorates and they are getting actively engaged in school based research ... they are building quite a different kind of research community, and something about having the pressure to produce outputs removed has actually opened up a different way of being an academic.”

“... What we are seeing already is .... the split off of research from teaching. You know we have increasing numbers of staff who will be moved onto teaching and learning contracts ... I think there is pressure on pedagogic research more generally to make it look like it’s REFable [but] there is a challenge to make it rigorous and theoretical enough.”

One UoA co-ordinator and two other respondents noted the difficulties encountered when staff were not on contracts which included research, but had potentially REFable outputs:

“There were some the uni could not submit as they were not on the right contracts and they wouldn’t adapt them - One person in charge of academic development who was a keen pedagogic researcher was so disappointed she left. A great shame and waste as her papers would have been good - she is now making a career as a free-lance researcher and loving every minute.”

Staff in central units, or on non-academic contracts, found it particularly difficult to get their work taken seriously by the REF UoA co-ordinators in their institution. The hierarchical nature of many universities came into clear focus in relation to the REF submissions. Sitting outside an academic school – and often primarily in teaching and learning roles – these staff found it very difficult to be taken seriously as researchers:

“I feel that in a way my team is in a double bind of being seen as professional services administrative service as well as being focused on teaching, so in combination you’ve got the double bind of two lower status activities. As I always say, not only are we doing the academic housework, we’re polishing the glass ceiling as well.”

The same respondent said, “if you’re not part of the REF crowd you don’t get involved in the REF process, and to be honest, because I’ve had so much other stuff to do I haven’t busied myself with it too much.” As in the example above, this participant was on a non-academic contract, with potentially REFable outputs, but the university would not change the contract to enable inclusion. In another case, an eminent and experienced educational development professor's wide-ranging policy contributions were not included, because the outputs were considered to be 'grey publications'. This again indicates the inability of some UoA coordinators to value the impact of higher education outputs which lie outside their own field of expertise.

2.4 Development of Impact case studies

As a new aspect of REF 2014, the need to demonstrate impact created considerable additional work for HEIs in preparing submissions. In total, 218 impact case studies were submitted to the UoA25 panel. According to the SP report:


“Case study submissions in UoA 25 reflected a very wide range of types of impact, from policy to practice in all sectors. In most instances, the threshold of underpinning research quality was comfortably passed, but in a few cases the underpinning research was judged not to meet the quality threshold. The selection of case studies by submitting institutions appeared to be particularly influenced by the availability of evidence of the relationship between research and impact, so that some areas of activity where evidence may have been intangible were probably under-represented” (REF 2014, pp. 110-111)

This perspective was echoed by several of the UoA co-ordinators who felt that a lack of evidence of impact had in many cases hampered their efforts to put together impact case studies:

“We were pretty late off the stocks with that... there was a sort of mad panic when I was asked by the people who were managing the process what are your impact case studies likely to be ... there was stuff well I would say was having impact but we weren’t able to tell a convincing enough story... there wasn’t enough evidence.”

Despite some institutions using external companies to assist in writing case studies, the short timescale available meant that finding sufficiently strong evidence was a problem. Perhaps because of this difficulty, one of the UoA coordinators noted that the selection of staff for submission could be influenced by their contribution to the impact aspect of REF 2014 – even if their research outputs were not strong:

“We weren’t quite convinced that that person met the 2.5 GPA but because they were significant in the impact case study and also elsewhere in the submission as well in the environment we just sort of felt well it would be a bit silly... for personal professional reasons and also for the coherence of the story... not to include that person”

This evidence illustrates the difficulty of making judgements about an individual’s research quality from whether they were submitted to the REF or not.

In evaluating the scope of impact case studies for this study, a search for ‘university’, ‘higher education’ and ‘HE’ was made of the titles. This identified 17 out of the 216 available in the REF submissions database, or 8% of the total. This compares with the 9% of outputs identified by the same method as relating to higher education, indicating only a marginal difference in representation. In further analysis, through text searches and reading of the case studies, a total of 45, or 13.5%, claimed some impact on HE alongside other education sectors.

The selection of case studies involved a variety of arrangements. Generally the initial choice was made by the UoA co-ordinator, and then taken to a central institutional decision-making body for confirmation or filtering. Participants reported their experiences with selection of case studies, and how these were constrained by the need to demonstrate a clear route from research to impact:

“It was much easier to work with current staff in the university who could be supported to go back through emails and letters to their various contacts and networks etc., to contact people who would confirm their impact locally, nationally and across the world.”

Although in some cases difficulty had been experienced in identifying case studies that met the eligibility criteria, all co-ordinators denied that their submissions had been limited by the number available:


“We didn’t have many people who had sufficient research conducted at [HEI] which had had enough time to have clear impact. Once people who left had been ruled out there was not so much good material to form the basis of the study with people who could be consulted with etc . My own work (I was a recent arrival) was having impact, involved original research, but had been conducted in another institution so could not be used. The movement of staff around the HE system, the low number of very active researchers in education and the general quality of outputs were all factors in deciding on the three which were eventually submitted.”

“No, we included all eligible people we managed to find across the uni for publications and then worked on enough case studies to cover this number.“

However, evidence from analysis of the submissions to REF 2014 suggests that there were far more submissions just below the cut-off point for requiring an additional case study than would be expected by chance (analysis by Tim Horne reported in Jump, 2015a). One UoA co-ordinator did admit that the issue of case studies was in the back of their mind when selecting individuals and outputs – and it is very hard to gauge whether this might have affected others at a sub-conscious level:

“it’s a very difficult question to answer it’s almost like at the back of your mind all the time ... when you are thinking about outputs you knew the ratio was sort of ticking over in your mind... “

In several cases, developing the case studies had involved the support of central university services or other professionals to ‘work up’ the material:

“It took a lot of academics’ time but also we had two professional members of staff who we already employed, they weren’t externals, brought in. Both of whom had long careers as professional journalists who were diverted from their normal work for about two and a half years to spend a proportion of their time on this activity... but I have seen some of the calculations for the cost of the REF, we never tried to quantify it internally but £5,000 sounds too low for an impact case study to me.”

It had also involved much additional time for the UoA coordinator or the person tasked with developing the case studies:

“I asked around all the current staff about who could do one now or someone who had left etc. This involved lots of exploratory work / phone calls - many of those who had left either weren’t contactable or didn’t want to be involved despite some chasing … this was a new process about which no-one seemed to be very knowledgeable.”

There was general recognition of the need to set up systems to capture impact from the beginning of the evaluation cycle, so that future iterations of the REF would prove easier:

“I think the main lesson we have learnt is and I guess everybody is saying is about the need to explicitly collect data about impact in a way we would never have dreamt of doing in the past.”

“… looking back on it we could have had a much sharper process in respect to that but we didn’t know what game we were playing.”


There were also some reports from stakeholders (rather than UoA co-ordinators) that the location of staff, relative to where the submission was being developed, affected which case studies were ultimately submitted. For example:

“There are some people in health involved with nurse education or the professional doctorate who could have [impact case studies] submitted perhaps, had we known enough about them earlier.”

“We sit outside of the School of Education and I wasn’t part of the group putting together the submission .... so our case study wasn’t included”

“We weren't encouraged to submit anything from our department. We weren't on the radar, so to speak - pedagogic research doesn't really count!!”

Interestingly, in spite of specific guidance to the contrary, several of the case studies were clearly attributable to a single individual rather than to a group’s work, although the panel’s view of these cases is of course unknown.

2.5 Type of impact claimed

As suggested by RCUK and by the Guidance on Submissions (REF 2011), pathways to impact vary widely and this was reflected in the submissions. Some examples of the areas of impact claimed for HE research in UoA25 are given in Table 6 (see Appendix 4 for fuller details of some of these examples). The routes to impact also varied, and those cited in some of the highest-rated submissions included:

• Specialist/advisory group/committee participation
• Involvement in developing new government policies
• Feed-in to guidelines/resources for agencies/professional bodies
• Journal article, paper or book
• Workshops/conference dissemination
• Media

Claims of impact on policy in this area focused strongly on key sector bodies such as HEFCE, QAA, OFFA, NUS, UUK and ECU. Impacts on pedagogy were much harder to evidence, and frequently relied on proxy measures such as impacts on academic development programmes, case studies included in HEA reports, and access data for Open Educational Resources (OERs).

Table 6. Example impact areas

1. Policy/legislation change – e.g. better-supported partnerships between colleges and universities (Sheffield)
2. Influence on planning or management of services – e.g. academically-relevant induction programmes to aid student retention (Edge Hill)
3. Practitioner debate informed or stimulated – e.g. threshold concepts framework employed across disciplines (Durham)
4. Awareness raising, challenging conventional wisdom – e.g. that courses should focus on subject area thinking and practising characteristics rather than on lists of intended learning outcomes (Edinburgh)
5. Input for lobbying or stakeholder groups – e.g. institution of the National Student Survey (OU)
6. Curriculum change – e.g. guidance on integrative approaches to assessment and feedback (Edinburgh)
7. Changed practice for specific groups – e.g. teaching and learning in higher education (Cumbria)
8. Influence on professional standards, guidelines or training – e.g. improved recruitment process for specialty training in medical-related fields in the UK (Plymouth)


Specific examples of evidencing impact on policy change in HE included submission of written or oral evidence to a government enquiry; research briefings for ministers; contribution to round table discussions with HEFCE and OFFA; and changes to Home Office policy. Changes to university policies (in named universities outside the home institution) were cited in some impact case studies. Whilst the links between the research and the policy change were not always clear, in some cases research findings were cited in key reports, making the pathway much more clearly identifiable. Changes to national guidelines on teaching and learning in HE were also cited, including a number of references to HEA reports or research syntheses. Where evidence of media coverage was presented, this was quite variable, from a few ad hoc reports to claims about work being featured ‘regularly’ in the press, or ‘over 40 articles on…’ Where workshops or conference presentations were cited, the stronger case studies had at least some evidence of changed behaviour as a result of the intervention (e.g. attendance of xx senior managers at ..., where 80% said that they would change their practice).

HEIs used many different means of corroborating or referencing their impacts, including:

• University library catalogues
• Testimonies/written statements
• Website visit statistics: page views, downloads
• Awards, e.g. MBE for Services to Higher Education
• Invited keynote speaker at national and international conferences/workshop facilitation/consultancies/invited expert

The extent of the use of emails or specifically elicited testimonials was surprisingly high. Altmetrics (such as number of downloads, number of times a web-page had been accessed and from which countries) were also used in places. As one participant with experience of advising several HEIs on their submissions acknowledged:

“Inevitably there is a sense of hierarchy [between different forms of impact claimed and evidence provided] and that this cannot always be objectively justified.”




2.6 Evaluating Impact

Evaluation of impact has been the focus of several commissioned reports (RAND 2015a, 2015b, KCL 2015) and the subject of much discussion in commentary on the REF (Smith et al. 2011, Jump 2015b, Khazragui and Hudson 2015), notably from the standpoint of tracing pathways from research to impact. In terms of pedagogic research, the issues were much the same as those reported more widely by the UoA co-ordinators, for example, the difficulty of finding evidence to support the claims:

“I suppose the flavour in the air was you would have to have something really good to be pedagogic that was really having an impact elsewhere and that you could prove it and... you know there were a number of us and I put myself in the frame who knew that... that our work is having an influence and sometimes internationally... yet... we couldn’t tell that story... it’s hard to dig down into.”

Nevertheless, there were questions raised about whether the UoA coordinators and managers making key decisions at the institutional level valued HE pedagogic research. Institution-level strategies were crucial since they could affect submission size and thus the number of case studies that could be submitted:

“… so perhaps what we wanted is something that is a very safe submission in terms of the quality of the outputs you know and that might be a smaller one.”

As another participant acknowledged, any peer review on the scale of the REF inevitably means that individual assessors will look at work that falls outside their own specialist area of expertise, and the small number of high-level pedagogic HE researchers in many institutions may have limited the internal review prospects for pedagogic research.

Interviewees in this project were largely in favour of assessing impact, recognising the need to justify the use of public funds, while acknowledging that education research generally does not attract large amounts of funding although it may make a real social difference. For example:

“I am never going to get much money for this, I am not even sure the top journals are going to want it, but this is going to really help affect the lives of people..”

Eligibility criteria (REF, 2012) were felt by some to have restricted the submission of case studies which had valid impact:

“There could have been one on my work on XX or yy had the original research not had to have been conducted in this institution. The people in business could certainly be submitted in the next REF as their research is starting to have considerable impact now, but not at the time of submission.”

The issue of the underpinning research being strong enough to support the claims of impact was also raised by some participants. The perception that pedagogic research is not (or should not be) REF-focused may lead staff to publish their work in less prestigious journals, placing the emphasis on sharing practice with the community, rather than contributing to education theory:

“A lot of what I saw was personally good kind of professional development work and necessary work but not REF work. .... I am not a pedagogy person but I would do work that wasn’t intended to be REFable and some of my colleagues do have a mix of stuff but there are a number of colleagues who are practitioners at heart and that is essentially who they are writing for.”

The focus of such work is often on changing practice at a local university level, so neither the underlying research nor the impacts are likely to fit within a REF submission. The need for more high-quality, and more highly valued, research that addresses professional concerns and practices is a long-running issue, as discussed by Furlong and Oancea (2005). However, in other cases, large-scale impacts of pedagogic research were reported as being overlooked, in part because the impacts were on HEIs. This was felt to be an anomaly, especially in comparison with the scale of some school education research impacts. For example, research which impacts on 6-10 primary schools, affecting a total of 1-3,000 pupils, could be included, whereas research which impacts on students in one HEI with 26,000 students across over 40 Schools or disciplines would be excluded. It was agreed by most interviewees that the rules should be amended for future iterations of the exercise.

Whilst such anomalies may be addressed in future iterations of research evaluation exercises, the nature of research evaluation, and the rules of REF 2014 specifically, has already affected the work of individual academics and HEIs. One participant indicated that it had made a very significant difference to the areas of research and work pursued, and could affect the direction of academic enquiry:

“I think the whole impact case studies thing is … tough and I think it’s harder on some kinds of research than others so I think it tends to privilege any education that has an overt policy kind of traction, media coverage and... a curriculum impact... interestingly we have come into the HE research area quite a bit and I think you know could we be a bit more assertive about that and say well what would be the characteristics of good research that wasn’t in those safe [traditional education] areas... yes so I think we could be in danger of having a slight backwash effect from what is a good impact study that you could easily evidence and that could have a constrained effect on the primary research that actually gets done … and before you know where you are you are being driven in a different direction where QR funding possibly gets directed towards stuff that will have an impact rather than stuff that is good knowledge production in its own right or has impact in ways that are hard to measure.“

2.7 The future

Participants were clearly aware that the strategic nature of the REF process leads to a certain amount of game-playing. As institutions and UoA co-ordinators develop a better understanding of the new process of impact assessment, these games are likely to become more sophisticated. Our interviewees already had some clear ideas about how the system could be subverted to serve their own interests. For example:

“You can play a game with that if you’ve got a member of staff such as those we were talking about earlier who are practitioners having high impact out there in the field of medicine, nursing, whatever erm... but not publishing all you have to do is put their name on one paper that is going to be submitted to the REF and then you can use it.”

There are also suggestions of ‘impact swaps’ and collaborations that would enable larger HE submissions than has been the case in the 2014 REF. The use of ‘altmetrics’ – indicators for research assessment derived from the social web, such as website ratings, microblogging and comments – also has the potential to be a game-changer if these come into wider use. An independent review of the role of metrics in the REF (Wilsdon et al. 2015, p. 42) stated that:

“A range of altmetrics have been shown to correlate significantly and positively with bibliometric indicators for individual articles, giving evidence that, despite the uncontrolled nature of the social web, altmetrics may be related to scholarly activities in some way.”

However, the report continues (pp. 51-52):

“For some, case studies are the only viable route to assessing impact; they offer the potential to present complex information and warn against more focus on quantitative metrics for impact case studies. Others however see case studies as “fairy tales of influence” and argue for a more consistent toolkit of impact metrics that can be more easily compared across and between cases. … Overall, then, despite the considerable body of mostly positive empirical evidence reviewed above, although alternative metrics do seem to give indications of where research is having wider social impact they do not yet seem to be robust enough to be routinely used for evaluations in which it is in the interest of stakeholders to manipulate the results.”

Looking to the future, when the size and shape of submissions can be expected to change again, participants also had thoughts about the scale of the return:

“We submitted xx FTE’s..., now I think that to go into the next REF, you know REF 2020, that needs to double its that scale of operation ... because the big players are the ones that are going to gain the big money and at that point I think you are in a very interesting area of discussion because there were [HE pedagogy] people out in the faculties that I thought... If I had been involved I would have made sure they were eligible but they don’t seem to have been submitted and I think that may well have been because we didn’t get the policy going early enough.”

If the submission size is to increase, then HE pedagogy researchers represent an active area for growth. This research has shown how academics in UK HEIs are already adjusting their practices and thinking in response to these changes.

3 Discussion

This report presents the first empirical attempt to analyse the nature of the relationship between the REF and pedagogic research in the UK. Given the nature of the sample, and the time constraints, the research can only be illustrative rather than definitive. However, it does provide evidence about some issues encountered by pedagogic researchers, both in relation to submission of outputs and to impact case studies. The findings suggest – as reported elsewhere (e.g. KCL 2015, RAND 2015a, 2015b) – that involvement in REF 2014 was a huge commitment for institutions and individuals involved with the process. Education was a popular (and therefore competitive) unit of assessment, with half of all institutions submitting to this panel. The Russell Group universities, those institutions with a member of staff on the sub-panel, and pre-1992 universities performed better than other institutions, although whether there is a causal relationship (in either direction) is not clear.

The implications of REF – and in particular the impact case study rules – for pedagogic research and researchers are not clear-cut, and there was a range of different views as to whether pedagogic research was generally of high enough quality or suitable for REF entry, as well as whether it was valued as highly as other sectors of educational research. In relation to the research questions raised at the start of this project, there are a number of key conclusions which can be drawn. There is evidence that only a small proportion of the work submitted to UoA25 in REF 2014 was about higher education. Impact case studies based on pedagogic research were in a small minority, particularly compared to the school sector. However, the proportion of outputs submitted was broadly similar to the proportion of impact case studies, and most expressions of concern from stakeholders were over whether staff doing pedagogic research were discriminated against in terms of selection for the overall process rather than specifically for the impact case studies. It is clear, however, that UoA co-ordinators felt under-prepared for preparing the impact case studies, and that evidence was not easily available, although this was an issue across the board (Manville et al., 2015).

The relatively low levels of HE pedagogic research in the REF submissions may indicate that there are simply few active pedagogic researchers; that the quality of pedagogic research is generally lower than that of school-based research; or that the structures which underpin the selection processes for the REF discriminate against pedagogic researchers. Indeed, all may be true to a certain extent. However, our respondents raised specific concerns about a number of ways in which pedagogic research might be under-represented in the REF – and the implications that this under-representation might have on HE research more widely – and these are discussed further below. Their concerns focused on the following points:

• Credibility problems for pedagogic research with university and faculty colleagues

• The potential tension between research which will have impact on practice and high quality academic outputs

• Localised political issues around entering individuals from outside Schools of Education

• Contractual issues with some pedagogic researchers not on academic contracts

• The need to keep the submission ‘simple’ (coherent) and ‘safe’ (interpreted as school-based research or teacher education)

• The inhibiting effect of the HEFCE definition of eligible impacts

Credibility problems

Pedagogic research has historically had a credibility problem. As long ago as 2002, Jenkins described it as the ‘Cinderella’ of the teaching and learning world: “Though often patronised with words of encouragement, it has not really been recognised or valued by the 'ugly sisters' of the QAA and in particular the Research Assessment Exercise” (Jenkins, 2002, p.1). Although times have changed, and there is significantly more acceptance of pedagogic research, this view was still expressed by some of the respondents. The idea that pedagogic research simply was not and could not be ‘REFable’ was encountered in several instances, and one participant described staff who did pedagogic research as getting into a “deficit space”. This issue was complicated by the fact that there was a whole range of pedagogic research encountered, some of which was very small scale and localised and clearly with a practice focus. That pedagogic research falls on a spectrum, from large-scale empirical studies and theoretical contributions to the literature to small-scale practice-based research, makes it relatively unusual – although not unique.


The prevailing view of UoA co-ordinators was, unsurprisingly, that their selection process was entirely objective with quality being the deciding factor. Thus, the reasoning went, if pedagogic research was under-represented, it was solely because the quality was not high enough. Obviously, there may be some validity to this claim: there is certainly a large amount of small scale pedagogic research and scholarship taking place in HEIs which is simply not suitable for inclusion in REF. However, there is also evidence that other factors (such as coherence of submission, or type of contract) come into play which may have influenced the decisions of UoA co-ordinators or other stakeholders. Questions were also raised by some of our participants about whether UoA co-ordinators were qualified to make decisions about pedagogic research: if HE pedagogic research is not strongly represented in the schools of education from where most REF submissions were led, there is an increased likelihood of such outputs being under-valued.

Tension between outputs and impacts

Overall there was a mixed picture in terms of impact case studies and outputs. Clearly research needs to be rigorous, grounded in theory and relevant in order to have impact – and it seems plausible that outputs which are academically credible would have the most impact. However, where the higher-status journals are looking for theoretical advances, and practitioners are more interested in strongly evaluated innovations, a tension was identified between targeting research for output or for impact. For education and other disciplines with a strong practice element, such as health and medicine, there is a need to share innovations speedily and effectively with practitioners, which may be in conflict with publication in slow-moving, closed-access, high-status journals.

In order to encourage panels to take pedagogic research more seriously across the board, changes to the case study rules would be advisable. This research has identified something of a tension which may exist in other vocational disciplines – between the need to produce theoretically-oriented outputs in order to achieve the level of academic excellence required for submission into the REF, and the need for practically-oriented research which is more likely to have impacts on educational communities and sectors. Another issue which may affect pedagogic research more than other areas is that there are fewer opportunities for large-scale longitudinal projects which might enable assessment of significant impact – although this was reported as being an issue for other sectors of education as well. Further research would be needed to explore the relationship between outputs and impact case studies.

Political issues surrounding selection

Decisions about selection of individual staff for submission were fraught, and although none of the interviewees reported the process as being subjective, some felt that decisions on inclusion were made for strategic rather than solely quality purposes. It is clear that since the last RAE, the institutional process has become more selective, and the ‘star rating’ or GPA required for submission has become more important. Gatekeepers for the REF submission could be hugely powerful in terms of decisions about which individuals were submitted, hence the use of external review in some cases. However, institutional decisions were also made which were out of the control of the UoA co-ordinators.

In terms of pedagogic research, the difficulties of both identifying and supporting staff outside of a school of education emerged as a theme. Since submissions were usually co-ordinated by Education staff, it required significant effort simply for them to find out about what other research was taking place elsewhere in the institution. Even then, processes such as mentoring and supporting staff who might be working in isolation in another faculty were challenging. Some institutions reported specific support and mentoring mechanisms for staff expecting to be submitted into the REF, but opening these up to individuals outside the School of Education (from where the UoA submission was managed) was only reported by one individual.

Contractual issues

Issues around staff contract type (research and teaching, teaching-only, or professional services) were raised by four of the respondents, and these seem set to become ever-more contentious with the increasing use of teaching-only or teaching and scholarship contracts (Locke, 2014). These contracts are often used to indicate staff who are not engaged in REF-level research. They are intended, in most cases, to allow flexibility of movement between the different contract types: thus, if an individual on a teaching-only contract is found to have four suitable REF outputs at the time of the next evaluation exercise, it is in theory possible for them to be changed onto a teaching and research contract. The data from our research participants suggest the need to ask questions about how likely this is to occur. Their experience (that contract changes were explicitly discouraged and that staff were discouraged from engaging in research, even where this would strengthen the REF submission) indicates a potential risk of creating a whole set of staff who are effectively disenfranchised from the REF process.

The underlying issue here remains the undervaluing of teaching in higher education in relation to research. Whilst the REF is seen as the major indicator of achievement in higher education, it will continue to be something which staff on all contract types aspire to be part of. Whether or not the proposed Teaching Excellence Framework (TEF) will achieve a similar outcome for teaching-related activities is hard to predict at the current time, but it is rather hard to envisage.

Coherence of submission

The element of higher education pedagogy was not always included, and some perceived it to be (as described in the REF sub-panel’s report) local in effect and under-theorised. As a result some HEIs chose the strategy of a ‘safe’ submission, drawing on the ‘traditional’ educational research areas of teacher education, school-based or early years learning. Where HE pedagogy was included, it appeared to involve the UoA coordinator devoting extra effort to working across the university with academics in other departments. The organisational structures in HE often seemed to work against collaboration across disciplines or departments, and there was significant reliance on individuals championing pedagogic research if it was to be included.

Applied research seems to have always been a challenging area in research evaluation. Barker (2007) identifies issues arising from the RAE exercise, such as a focus on “traditional academic disciplines”, and reports of discrimination against interdisciplinary research. However, she also states that: “… evidence for an actual retreat from such research is not easily found.” (Barker 2007, p.7) In a similar way, there is some evidence that UoA 25 co-ordinators were more likely to focus on research which was relatively closely connected to their own area – both geographically and in the discipline – which may have led to some HE pedagogic research being overlooked.

Inhibiting effect of ‘Impact’ definition

This project confirmed that, for the Education UoA, the impact case studies proved particularly problematic. Participants reported that individuals in their departments were not fully prepared for the requirements of REF 2014 and did not always have the evidence to enable case studies to be developed for some areas of work. Where there was some doubt over the acceptability of pedagogic research in general, this meant that impact case studies focusing on this area were unlikely to be selected. Although it is widely acknowledged that academic research needs to demonstrate its relevance to address societal needs (Gardner 2011), for HE pedagogy the rules indicating that case studies affecting only the submitting HEI – however many people are impacted – are ineligible were felt to be limiting and illogical by many of the interviewees.

There is also some evidence that UoA co-ordinators took the view that any impacts on HE were not eligible. For example, Table 7 outlines some case studies which respondents told us they would like to have submitted if the rules were different.

Table 7. Impact case studies which were not submitted to REF 2014

Broad research area – Impact on
E-learning in Higher Education – 5 different UK universities, for curriculum development and evaluation
Assessment in Higher Education – More than 30 HEIs in the UK and internationally, through use of a research-based curriculum development tool
Reducing the attainment gap in higher education – Development of a CPD framework for use with staff to investigate issues around the attainment of BME students; rolled out across a university of around 8,000 students
Sustainability in higher education – Research fed into a published book and training which has been used in 5 universities, by HEFCE, and by 5 companies in the UK

The fact that several of these case studies could have been entered under the existing rules only serves to underline the limiting effect of the perceived bias against HE pedagogic research. Reasons given for non-submission related both to the rules of submission and to ‘political agendas and power relations within the Faculty leadership’ as also described by Holligan et al. (2011). In some cases, the failure to submit pedagogic research seemed to be caused less by the actual rules of the process than by perceptions about the acceptability or value of pedagogic research. The failure to include an impact case study of probably one of the most influential HE researchers in the UK is indicative of the inability of some UoA coordinators to understand the impact of HE pedagogy outputs which go beyond the immediate unit.

Implications for pedagogic research more widely

The findings of this study raise some concerns about the longer-term sustainability of HE pedagogic research unless changes can be made to the REF system to make it more open and accessible to pedagogic researchers. There is a risk, raised by some of our respondents, that pedagogic researchers will either change to a different area of research in order to enhance their prospects for career progression and promotion, or will simply be moved onto contracts in which they are not permitted to undertake research, or in which research outputs and impact are not considered.

The strategising or ‘game-playing’ involved in research assessment, also found in this research, has been widely documented in the literature as restricting the nature of the work that academics commit to (Smith 2004, Tight 2008, Gilroy and McNamara 2009). This is described by Watermeyer (2014, p. 13):

“What then is required of academics, as intimated by Gray (2012), is greater sophistication in gamesmanship, tactics and guile in performing ‘impactfully’. In applying themselves in impactful ways, academics may appease HE funding authorities and galvanise the return of cherished academic tenets of intellectual autonomy, critical freedom and claims to self-sovereignty. It is in this context that the idea of impact in the REF – as more about how the impact story is told than the content of the story itself – is reinforced. … like any significant impact, which demands time with which to modulate and mature, understandings of the issues and obstacles that complicate and problematise the academic relationship with an impact agenda such as those identified herein, will only fully be known with the passing of many other REFs.”

As deliberation on future rounds of research evaluation continues, possibilities of changing the nature of the exercise have been raised, for example mainstreaming such reporting into each HEI’s normal operating procedures (Jump 2015b, Wilsdon et al. 2015). However, concerns remain that this will lead to over-reliance on metrics, and ‘metrics cannot replace humans’ (Smith, 2015). This project suggests that the system of qualitative peer review for REF submissions and impact case studies (within and beyond individual universities) could be opened up, with some means of ensuring reasonable representation of each sub-sector within each unit of assessment. Future research might explore these issues further to inform HEIs and individual academics on how HE pedagogy might be more widely represented in future rounds of the REF.

4 Recommendations

The outcomes of the research presented above lead to the following recommendations for consideration by the different stakeholder groups.


For individual pedagogic researchers:

• Pedagogic researchers whose home discipline is not education should engage in the development of pedagogic research skills, and aim to publish in high-level peer-reviewed journals as well as in practitioner publications;
• Consider collaborations with education researchers both within their own institution, and in other UK and international universities, to benefit from discipline knowledge and education theory;
• All researchers can increase impact through giving keynotes or talks, by specifically engaging with decision-makers about recommended policy or practice changes which result from the research, and by tracking the impacts of their own research in a consistent manner;
• Where appropriate, researchers should engage policy-makers locally or nationally in discussions about research findings – preferably before the research is published (or even before it begins), so that key stakeholders have an input into the research process and will potentially be more interested in the research findings;
• Pedagogic researchers should consider partnering with colleagues in different universities to engage in ‘impact swaps’ and research collaborations, to increase the likelihood of research recommendations being implemented beyond the institution of origin.

For higher education institutions:

• Recognise that for the 2020 REF the required critical mass may make HE pedagogic researchers crucial for Education UoA submissions, and develop a research strategy which includes supporting pedagogic researchers across the institution;
• Search for the ‘hidden’ researchers who may be producing substantive papers in different departments of the university, in central units, or from non-traditional, teaching-only or professional services staff. Institutions should also remain open to the possibility of staff moving from one contract type to another if their output indicates that this would be appropriate;
• Support and mentor interested staff at all levels to engage in and publish pedagogic research – and encourage them to take the time to develop research outputs in high quality journals. This may apply to staff who are not on traditional academic contracts as well as those who are;
• Make plans to measure impact much earlier in the research cycle. A recurring theme from the interviewees was the feeling of being unprepared for impact assessment. Since this is very likely to be a part of the next REF submission, institutions should be doing all they can to identify potential impact case studies and start collecting evidence of impact now;
• Consider utilising ‘impact tracker’ software, or other ways of automating the collection of altmetrics, since these kinds of metrics may become increasingly common as ways of assessing impact;
• HEIs could feed into developments for the proposed ‘Teaching Excellence Framework’ (TEF) some measures of engagement in, or impact of, pedagogic research, and lobby for revised guidance for REF2020 to ensure that HE pedagogic research is given due consideration.

For the Higher Education Academy (HEA):

• The HEA is in a very strong position to raise the profile of HE pedagogic research nationally and within higher education institutions, so that it is regarded as an important and valued activity for staff on all types of contract;
• The HEA should advocate for revised guidance for REF2020 to ensure that HE pedagogic research and impact case studies are considered on an equal basis to research and impacts on other education sectors;
• The HEA might want to feed into developments for the proposed ‘Teaching Excellence Framework’ (TEF) some measures of engagement in or impact of pedagogic research driving enhancement in the student experience;
• The HEA should continue to organise and support pedagogic research activity in HE through events and networks such as the Pedagogic Research Network. Such activities should be open to staff on a range of different contracts, including those from the College-based HE (CBHE) sector;
• The HEA might consider running further ‘policy engagement events’ – bringing together academics and policy makers to discuss key issues and underpinning research evidence.

For higher education policy makers:

• Ensure that the make-up of the panel for UoA25 – and others as appropriate – in REF2020 includes representation of HE pedagogic researchers;
• Provide explicit encouragement to institutions to include pedagogic research in REF submissions where this meets the standards for originality, significance and rigour;
• Consider the implications of the REF (and TEF) process and funding arrangements for pedagogic research in HE. Pedagogic research sits on the boundary of teaching and research – yet there is a risk that it becomes under-valued by both measurement processes.


5 References

Barker, K. (2007). The UK Research Assessment Exercise: the evolution of a national research evaluation system. Research Evaluation, 16(1): 3–12.

Biri, D. (2014). The impact agenda in REF 2014: an overview. UCL blog available at: https://www.ucl.ac.uk/steapp/steapp-news-publication/2013-14/biri

Bornmann, L. (2013). What Is Societal Impact of Research and How Can It Be Assessed? A Literature Survey. Journal of the American Society for Information Science and Technology 64(2): 217-233.

Brown, P. (2013). ‘Education, opportunity and the prospects for social mobility’, British Journal of Sociology of Education 34(5-6): 678-700.

Cotton, D. and Kneale, P. (2014) Supporting Pedagogic Research at Plymouth: The Birth of an Institute. In: Building Staff Capacity for Pedagogic Research in Higher Education (edited by Lindsey McEwan and Kristine Mason O’Connor). SEDA Special 37.

Dean, A., Wykes, M. and Stevens, H. (eds.) (2013). 7 Essays on Impact, DESCRIBE Project Report for JISC, University of Exeter.

Frankl, M., Goddard, A. and Ransow, G. (2014). Golden triangle pulls ahead in REF shake-out, Research Fortnight 18 December.

Furlong, J., and Oancea, A. (2005). Assessing quality in applied and practice-based educational research: A framework for discussion. Review of Australian research in education: counterpoints on the quality and impact of educational research – a special issue of the Australian Educational Researcher, 6: 89-104.

Gardner, J. (2011). Educational research: What (a) to do about impact! British Educational Research Journal 37(4): 543-561.

Gilroy, P. and McNamara, O. (2009). A critical history of research assessment in the United Kingdom and its post‐1992 impact on education, Journal of Education for Teaching 35(4): 321-335.

Holligan, C., Wilson, M. and Humes, W. (2011) Research Cultures in English and Scottish University Education Departments: An Exploratory Study of Academic Staff Perceptions. British Educational Research Journal 37(4): 713-34.

Holmwood, J. and McKay, S. (2015). As REF 2014 goes by: a fight for cash and glory … (with apologies to Casablanca). The Sociological Review, 26 January.

Jenkins, A. (2002). Pedagogic Research at Brookes: Achievements, Opportunities and Questions. Teaching Forum 50 (Autumn).

Johnston, R. (2008) On Structuring Subjective Judgements: Originality, Significance and Rigour in RAE2008. Higher Education Quarterly 62(1-2): 120-147.


Jump, P. (2015a). Careers at risk after case studies ‘game playing’, REF study suggests. Times Higher Education, January 22nd 2015.

Jump, P. (2015b). The impact of impact, Times Higher Education Review, 19 February.

Khazragui, H. and Hudson, J. (2015). Measuring the benefits of university research: impact and the REF in the UK, Research Evaluation 24(1): 51-62.

KCL (King's College London) (2015). The Nature, Scale and Beneficiaries of Research Impact: an initial analysis of REF 2014 impact case studies, March, London. Available at: http://www.hefce.ac.uk/pubs/rereports/Year/2015/analysisREFimpact/

Locke, W. (2014) Shifting academic careers: implications for enhancing professionalism in teaching and supporting learning. Report for the Higher Education Academy (HEA). Available online at: https://www.heacademy.ac.uk/sites/default/files/resources/shifting_academic_careers_final.pdf

McGettigan, A. (2013). The Great University Gamble: money, markets and the future of higher education, London: Pluto Press.

McNay, I. (2015). Learning from the UK Research Excellence Framework: ends and means in research quality assessment, and the reliability of results in Education. Higher Education Review 47(3).

McNay, I. (2003) Assessing the Assessment: An Analysis of the UK Research Assessment Exercise, 2001, and Its Outcomes, with Special Reference to Research in Education. Science and Public Policy 30(1): 47-54.

Manville, C., Morgan Jones, M., Frearson, M., Castle-Clarke, S., Henham, M., Gunashekar, S. and Grant, J. (2015) Preparing impact submissions for REF 2014: An evaluation. Cambridge: RAND Corporation. Available online at: http://www.rand.org/content/dam/rand/pubs/research_reports/RR700/RR727/RAND_RR727.pdf

Nicholson, C. (2015). REF 2014 admin costs were higher than predicted, Research Professional, 23 March, available at: http://www.researchresearch.com/index.php?option=com_news&template=rr_2col&view=article&articleId=1350863&utm_content=buffer3b17b&utm_medium=social&utm_source=twitter.com&utm_campaign=buffer

Phillips, M. (2012). Research Universities and Research Assessment, Position Paper for the League of European Research Universities (LERU) available at http://www.leru.org/index.php/public/publications/category/position-papers/

RAND Europe (2015a). Preparing impact submissions for REF 2014: an evaluation. Available at: http://www.hefce.ac.uk/pubs/rereports/Year/2015/REFimpacteval/Title,103726,en.html

RAND Europe (2015b). Assessing impact submissions for REF 2014: an evaluation. available at: http://www.hefce.ac.uk/pubs/rereports/Year/2015/REFimpacteval/Title,103726,en.html

Rebora, G. and Turri, M. (2013). The UK and Italian research assessment exercises face to face, Research Policy 42(9): 1657-1666.

REF (2011). Guidance on Submissions, available at: http://www.ref.ac.uk/pubs/2011-02/


REF (2012). Panel Criteria and Working Methods, available at: http://www.ref.ac.uk/pubs/2012-01/

REF (2014). Research Excellence Framework 2014: Overview report by Main Panel C and Sub-panels 16-26, available at: http://www.ref.ac.uk/panels/paneloverviewreports/

Rosenberg, G. (2015). Research Excellence Framework 2014: Manager’s Report, available at: http://www.ref.ac.uk

Sayer, D. (2014). Five reasons why the REF is not fit for purpose, The Guardian, 15 December. Available at: http://www.theguardian.com/higher-education-network/2014/dec/15/research-excellence-framework-five-reasons-not-fit-for-purpose

Smith, A. (2015), Metrics review says data can’t replace humans in REF, Research Professional, 9 July, available at: http://www.researchresearch.com/index.php?option=com_news&template=rr_2col&view=article&articleId=1353420

Smith, S., Ward, V., and House, A. (2011). ‘Impact’ in the proposals for the UK's Research Excellence Framework: shifting the boundaries of academic autonomy, Research Policy 40(10): 1369-1379.

technopolis (2015). REF Accountability Review: costs, benefits and burden, July. Available at: http://www.hefce.ac.uk/pubs/rereports/Year/2015/refreviewcosts/Title,104406,en.html

Tight, M. (2004). Research into higher education: an a-theoretical community of practice?, Higher Education Research and Development 23(4): 395-411.

Tight, M. (2008). Higher education research as tribe, territory and/or community: A co-citation analysis, Higher Education 55(5): 593-605.

Watermeyer, R. (2014). Impact in the REF: issues and obstacles, Studies in Higher Education 1-16, doi:10.1080/03075079.2014.915303.

Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., Curry, S., Hill, S., Jones, R., Kain, R., Kerridge, S., Thelwall, M., Tinkler, J., Viney, I., Wouters, P., Hill, J., and Johnson, B. (2015). The Metric Tide: report of the independent review of the role of metrics in research assessment and management, July, doi: 10.13140/RG.2.1.4929.1363

Yorke, M. (2000) A Cloistered Virtue? Pedagogical Research and Policy in UK Higher Education. Higher Education Quarterly 54(2): 106-126


Appendix 1. Interview schedule

(a) Questions for Heads of School / Unit of Assessment Co-ordinators

Can you tell me about your role in the REF2014 submission for your university?

o When were you given the role?
o Had you undertaken a similar role in RAE 2008? (If so, how did it differ?)
o What is your own area of research (which sector)?

What was your university’s process for selection of staff to submit to the REF in UoA 25?
o Did you have to read and evaluate the outputs?
o Were other internal staff involved?
o Were external staff involved?
o Did you have a cut-off point in terms of GPA?

Did you submit HE pedagogic research to the REF in your UoA?
o If so, were these staff from: a) the School of Education (or similar) or b) other departments / central units?
o If not, why not?
o Was any pedagogic research submitted to you which was not selected?
o Are you aware of pedagogic research being submitted through any other UoA (if so, which)?

How did you select the Impact case studies for REF 2014?
o When did you start the process for choosing these?
o What criteria did you use for selection?
o Did you have any impact case studies that involved HE pedagogic research?
o If so, were they ultimately submitted? (Why / why not?)
o Could you give brief details of case studies which might have been included in the REF under different circumstances? (e.g. a change in rules such as exclusion of case studies with impact only on the submitting HEI.)
o Did you have any difficulty finding enough Impact case studies?
o If so, did you submit fewer individuals / outputs because of this?

In your view, was HE pedagogic research given the same value in the Education UoA for REF2014 as research in the FE, school or early years sectors?

o Why/ why not?

Is there anything else that you’d like to add which has not been covered by the questions above?

Thank you for your time.

(b) Questions for stakeholders with an interest in pedagogic research


Can you tell me about your HE pedagogic research experience?
o How long have you been engaged in pedagogic research?
o Approximately how many publications in ped-res do you have?
o Have you ever had your research submitted to the RAE/REF? (If yes, in which UoA?)
o Where are you based? (School of Education / other department / central unit)
o Does your institution have a pedagogic research centre or institute (or similar)?

Can you tell me about your experience of REF2014?
o Did your university submit to the Education UoA? (If not, why not?)
o Was pedagogic research submitted to any other UoA to the best of your knowledge? (If yes, which?)
o If you had outputs reviewed, can you tell me about the process? (Were internal or external staff involved? What feedback did you get? Was there a cut-off point for GPA? What were your feelings about the experience?)

One of the major changes for REF2014 was the inclusion of impact case studies – what do you consider to be the key impacts of your own work? (On what or whom are the intended impacts?)
o Are you aware of any impact case studies which involved HE pedagogic research? (Even if these were not ultimately submitted to the REF.)
o What other impacts are you aware of from HE pedagogic research?
o Do you have an example of a case study which might have been submitted if the REF rules were different? (e.g. exclusion of case studies where impact was only on the submitting HEI)
o Has your research strategy changed as a result of the inclusion of impact case studies in the REF?

In your view, was HE pedagogic research given the same value in the Education UoA for REF2014 as research in the FE, school or early years sectors?

o Why / why not?
o Do the current REF rules impact on your research or dissemination practice? (If so, how?)

Is there anything else that you’d like to add which has not been covered by the questions above?

Thank you for your time.


Appendix 2. Profiles for UoA25 ordered by 4*

(HEIs with panel or sub-panel membership representation are shaded orange; HEIs in bold, italic font are Russell Group universities) For more detail see alphabetical list in Appendix 3
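As a reading aid for the profiles that follow, and for the ‘GPA’ cut-offs mentioned by interviewees earlier in this report, a minimal worked example is given below. It assumes the standard REF-style grade point average weighting (4/3/2/1/0 applied to the percentage quality profile); the Oxford row from the table is used purely as an illustration.

% Assumption: GPA is the weighted average of the percentage quality profile,
% with 4*, 3*, 2*, 1* and unclassified weighted 4, 3, 2, 1 and 0 respectively.
\[
\mathrm{GPA} \;=\; \frac{4(\%\,4^{*}) + 3(\%\,3^{*}) + 2(\%\,2^{*}) + 1(\%\,1^{*}) + 0(\%\,\mathrm{unclassified})}{100}
\]
% Illustrative calculation using the University of Oxford overall profile (65/27/8/0/0):
\[
\mathrm{GPA}_{\text{Oxford}} \;=\; \frac{(4)(65) + (3)(27) + (2)(8) + (1)(0) + (0)(0)}{100} \;=\; \frac{357}{100} \;=\; 3.57
\]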

Institution name   FTE Category A staff submitted   4*   3*   2*   1*   unclassified

University of Oxford 39.22 65 27 8 0 0
University of Nottingham 24.60 55 29 15 0 1
University of Cambridge 34.20 54 24 20 2 0
King's College London 36.30 54 35 10 1 0
University of Durham 24.50 50 31 18 1 0
University College London (IoE) 219.00 48 30 18 3 1
Cardiff University 20.60 48 36 14 2 0
University of Birmingham 23.70 47 35 15 3 0
University of Bristol 34.50 44 36 19 1 0
University of Exeter 25.92 44 40 16 0 0
University of York 22.30 44 37 15 3 1
University of Edinburgh 39.97 43 36 20 1 0
Open University 54.26 38 31 26 5 0
University of Sheffield 14.50 38 56 6 0 0
University of Manchester 33.10 36 42 19 3 0
Queen's University Belfast 22.80 35 52 11 2 0
University of Warwick 34.57 32 44 20 4 0
University of Glasgow 37.50 32 42 20 6 0
University of Southampton 22.00 31 47 21 1 0
University of Reading 15.60 29 47 23 1 0
Loughborough University 7.70 28 57 15 0 0
Lancaster University 11.00 25 53 20 2 0
University of Sussex 16.10 25 48 24 3 0
University of Ulster 12.20 24 36 32 8 0
University of Leeds 32.10 23 52 23 2 0
Manchester Metropolitan University 22.50 23 44 31 2 0
University of Stirling 14.00 22 59 15 4 0
Newcastle University 16.70 21 50 25 4 0
Roehampton University 14.20 20 51 26 3 0
University of the West of England 10.20 20 33 44 3 0

Stranmillis University College 5.00 20 24 28 28 0
University College London excluding IoE 8.20 17 33 41 5 4
Staffordshire University 6.75 16 30 30 24 0
University of Strathclyde 24.60 16 51 28 5 0


Goldsmiths' College 9.50 15 46 36 3 0
Liverpool Hope University 12.20 15 38 34 13 0
London Metropolitan University 3.80 15 38 44 3 0
University of East Anglia 13.20 14 63 18 2 3
Brunel University London 14.20 13 47 33 7 0
University of Huddersfield 11.75 13 41 40 6 0
University of Bedfordshire 8.40 12 25 40 23 0
University of Leicester 24.10 12 49 33 6 0
University of Lincoln 6.60 12 30 40 18 0
Sheffield Hallam University 12.40 12 57 21 9 1
University of Winchester 7.13 12 35 42 9 2

Birmingham City University 7.50 11 28 49 12 0
University of Chester 5.00 11 18 42 25 4
Newman University 7.40 11 26 53 10 0
Oxford Brookes University 10.50 11 26 46 16 1
University of East London 8.40 10 47 31 12 0
University of Hull 13.60 10 25 51 13 1
Liverpool John Moores University 12.80 10 51 36 0 3
University of Brighton 7.30 9 36 48 7 0
University of Derby 6.00 9 22 47 19 3
Edge Hill University 11.70 9 27 40 24 0
University of Plymouth 24.00 9 45 35 11 0
University of Dundee 8.80 9 65 22 4 0
University of Aberdeen 9.60 8 59 29 4 0
Glasgow Caledonian University 6.40 8 35 40 17 0
St Mary's University, Twickenham 11.96 7 9 23 48 13
University of Worcester 14.30 7 22 21 46 4
Bishop Grosseteste University 6.00 6 22 35 33 4
Nottingham Trent University 20.90 6 15 45 28 6
Bath Spa University 13.20 5 30 43 19 3
Canterbury Christ Church University 21.60 5 50 32 13 0
University of Sunderland 21.10 5 11 34 38 12
The University of West London 3.90 5 20 41 32 2


University of Wolverhampton | 9.20 | 5 | 36 | 54 | 5 | 0
Leeds Beckett University | 11.61 | 4 | 26 | 51 | 18 | 1
University of the West of Scotland | 14.00 | 4 | 41 | 41 | 14 | 0
University of Greenwich | 10.00 | 3 | 11 | 49 | 27 | 10
York St John University | 9.40 | 3 | 10 | 53 | 26 | 8
University of Bolton | 7.40 | 2 | 25 | 54 | 13 | 6
University of Northampton | 14.60 | 2 | 25 | 42 | 31 | 0
Anglia Ruskin University | 5.20 | 0 | 26 | 45 | 26 | 3
University of Cumbria | 4.72 | 0 | 23 | 52 | 23 | 2


Appendix 3. Profile details

(Rows shaded orange are those universities which had representation in the REF process, either on Main Panel C or the UoA25 sub-panel. Bold italics are members of the Russell Group.)
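The Overall row for each institution in the table below is not independent data: it is a weighted combination of the Outputs, Impact and Environment rows, using the published REF 2014 weightings of 65 per cent (outputs), 20 per cent (impact) and 15 per cent (environment), with the result rounded to whole percentages. A minimal sketch of that arithmetic is given below (in Python, purely for illustration); the largest-remainder rounding step is an assumption used only to make the rounded figures sum to 100, not a documented part of the REF method.

import math

# Illustrative sketch: combine the Outputs (65%), Impact (20%) and Environment
# (15%) sub-profiles into an overall quality profile, as in REF 2014.
# Each profile is ordered 4*, 3*, 2*, 1*, unclassified, as in the table below.

WEIGHTS = (0.65, 0.20, 0.15)  # outputs, impact, environment

def overall_profile(outputs, impact, environment):
    """Weight the three sub-profiles and round to whole percentages summing to 100."""
    raw = [WEIGHTS[0] * o + WEIGHTS[1] * i + WEIGHTS[2] * e
           for o, i, e in zip(outputs, impact, environment)]
    profile = [math.floor(x) for x in raw]
    # Assumed largest-remainder rounding: hand the remaining percentage points
    # to the cells with the largest fractional parts.
    shortfall = 100 - sum(profile)
    for k in sorted(range(len(raw)), key=lambda k: raw[k] - profile[k], reverse=True)[:shortfall]:
        profile[k] += 1
    return profile

# Example: the University of Birmingham rows below give back the published
# overall profile of 47, 35, 15, 3, 0.
print(overall_profile(outputs=[38.2, 38.3, 19.6, 3.9, 0.0],
                      impact=[36.7, 50.0, 13.3, 0.0, 0.0],
                      environment=[100.0, 0.0, 0.0, 0.0, 0.0]))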

Institution name | Profile | FTE Category A staff submitted | 4* | 3* | 2* | 1* | UC
Anglia Ruskin University | Outputs | 5.20 | 0.0 | 28.6 | 42.8 | 23.8 | 4.8
Anglia Ruskin University | Impact | 5.20 | 0.0 | 0.0 | 50.0 | 50.0 | 0.0
Anglia Ruskin University | Environment | 5.20 | 0.0 | 50.0 | 50.0 | 0.0 | 0.0
Anglia Ruskin University | Overall | 5.20 | 0 | 26 | 45 | 26 | 3
Bath Spa University | Outputs | 13.20 | 8.2 | 32.6 | 36.8 | 18.3 | 4.1
Bath Spa University | Impact | 13.20 | 0.0 | 40.0 | 60.0 | 0.0 | 0.0
Bath Spa University | Environment | 13.20 | 0.0 | 0.0 | 50.0 | 50.0 | 0.0
Bath Spa University | Overall | 13.20 | 5 | 30 | 43 | 19 | 3
University of Bedfordshire | Outputs | 8.40 | 18.8 | 37.5 | 34.3 | 9.4 | 0.0
University of Bedfordshire | Impact | 8.40 | 0.0 | 0.0 | 60.0 | 40.0 | 0.0
University of Bedfordshire | Environment | 8.40 | 0.0 | 0.0 | 37.5 | 62.5 | 0.0
University of Bedfordshire | Overall | 8.40 | 12 | 25 | 40 | 23 | 0
University of Birmingham | Outputs | 23.70 | 38.2 | 38.3 | 19.6 | 3.9 | 0.0
University of Birmingham | Impact | 23.70 | 36.7 | 50.0 | 13.3 | 0.0 | 0.0
University of Birmingham | Environment | 23.70 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0
University of Birmingham | Overall | 23.70 | 47 | 35 | 15 | 3 | 0
Birmingham City University | Outputs | 7.50 | 16.7 | 40.0 | 33.3 | 10.0 | 0.0
Birmingham City University | Impact | 7.50 | 0.0 | 0.0 | 70.0 | 30.0 | 0.0
Birmingham City University | Environment | 7.50 | 0.0 | 12.5 | 87.5 | 0.0 | 0.0
Birmingham City University | Overall | 7.50 | 11 | 28 | 49 | 12 | 0
Bishop Grosseteste University | Outputs | 6.00 | 9.5 | 33.4 | 38.1 | 19.0 | 0.0
Bishop Grosseteste University | Impact | 6.00 | 0.0 | 0.0 | 40.0 | 50.0 | 10.0
Bishop Grosseteste University | Environment | 6.00 | 0.0 | 0.0 | 12.5 | 75.0 | 12.5
Bishop Grosseteste University | Overall | 6.00 | 6 | 22 | 35 | 33 | 4
University of Bolton | Outputs | 7.40 | 2.9 | 22.8 | 48.6 | 17.1 | 8.6
University of Bolton | Impact | 7.40 | 0.0 | 40.0 | 60.0 | 0.0 | 0.0
University of Bolton | Environment | 7.40 | 0.0 | 12.5 | 75.0 | 12.5 | 0.0
University of Bolton | Overall | 7.40 | 2 | 25 | 54 | 13 | 6
University of Brighton | Outputs | 7.30 | 13.8 | 34.5 | 41.4 | 10.3 | 0.0
University of Brighton | Impact | 7.30 | 0.0 | 50.0 | 50.0 | 0.0 | 0.0
University of Brighton | Environment | 7.30 | 0.0 | 25.0 | 75.0 | 0.0 | 0.0
University of Brighton | Overall | 7.30 | 9 | 36 | 48 | 7 | 0
University of Bristol | Outputs | 34.50 | 29.9 | 39.6 | 29.2 | 0.7 | 0.6
University of Bristol | Impact | 34.50 | 50.0 | 50.0 | 0.0 | 0.0 | 0.0
University of Bristol | Environment | 34.50 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0


University of Bristol | Overall | 34.50 | 44 | 36 | 19 | 1 | 0
Brunel University London | Outputs | 14.20 | 19.7 | 32.8 | 36.0 | 11.5 | 0.0
Brunel University London | Impact | 14.20 | 0.0 | 90.0 | 10.0 | 0.0 | 0.0
Brunel University London | Environment | 14.20 | 0.0 | 50.0 | 50.0 | 0.0 | 0.0
Brunel University London | Overall | 14.20 | 13 | 47 | 33 | 7 | 0
University of Cambridge | Outputs | 34.20 | 38.1 | 31.3 | 27.9 | 2.0 | 0.7
University of Cambridge | Impact | 34.20 | 80.0 | 10.0 | 10.0 | 0.0 | 0.0
University of Cambridge | Environment | 34.20 | 87.5 | 12.5 | 0.0 | 0.0 | 0.0
University of Cambridge | Overall | 34.20 | 54 | 24 | 20 | 2 | 0
Canterbury Christ Church University | Outputs | 21.60 | 7.1 | 37.8 | 35.7 | 19.4 | 0.0
Canterbury Christ Church University | Impact | 21.60 | 0.0 | 63.3 | 36.7 | 0.0 | 0.0
Canterbury Christ Church University | Environment | 21.60 | 0.0 | 87.5 | 12.5 | 0.0 | 0.0
Canterbury Christ Church University | Overall | 21.60 | 5 | 50 | 32 | 13 | 0
University of Chester | Outputs | 5.00 | 16.7 | 27.7 | 55.6 | 0.0 | 0.0
University of Chester | Impact | 5.00 | 0.0 | 0.0 | 30.0 | 70.0 | 0.0
University of Chester | Environment | 5.00 | 0.0 | 0.0 | 0.0 | 75.0 | 25.0
University of Chester | Overall | 5.00 | 11 | 18 | 42 | 25 | 4
University of Cumbria | Outputs | 4.72 | 0.0 | 29.4 | 64.7 | 5.9 | 0.0
University of Cumbria | Impact | 4.72 | 0.0 | 20.0 | 50.0 | 30.0 | 0.0
University of Cumbria | Environment | 4.72 | 0.0 | 0.0 | 0.0 | 87.5 | 12.5
University of Cumbria | Overall | 4.72 | 0 | 23 | 52 | 23 | 2
University of Derby | Outputs | 6.00 | 14.3 | 33.3 | 33.4 | 14.2 | 4.8
University of Derby | Impact | 6.00 | 0.0 | 0.0 | 80.0 | 20.0 | 0.0
University of Derby | Environment | 6.00 | 0.0 | 0.0 | 62.5 | 37.5 | 0.0
University of Derby | Overall | 6.00 | 9 | 22 | 47 | 19 | 3
University of Durham | Outputs | 24.50 | 26.0 | 44.8 | 27.1 | 2.1 | 0.0
University of Durham | Impact | 24.50 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0
University of Durham | Environment | 24.50 | 87.5 | 12.5 | 0.0 | 0.0 | 0.0
University of Durham | Overall | 24.50 | 50 | 31 | 18 | 1 | 0
University of East Anglia | Outputs | 13.20 | 22.0 | 52.0 | 22.0 | 0.0 | 4.0
University of East Anglia | Impact | 13.20 | 0.0 | 80.0 | 10.0 | 10.0 | 0.0
University of East Anglia | Environment | 13.20 | 0.0 | 87.5 | 12.5 | 0.0 | 0.0
University of East Anglia | Overall | 13.20 | 14 | 63 | 18 | 2 | 3
University of East London | Outputs | 8.40 | 12.1 | 51.5 | 18.2 | 18.2 | 0.0
University of East London | Impact | 8.40 | 10.0 | 50.0 | 40.0 | 0.0 | 0.0
University of East London | Environment | 8.40 | 0.0 | 25.0 | 75.0 | 0.0 | 0.0
University of East London | Overall | 8.40 | 10 | 47 | 31 | 12 | 0
Edge Hill University | Outputs | 11.70 | 2.1 | 35.4 | 31.2 | 31.3 | 0.0
Edge Hill University | Impact | 11.70 | 40.0 | 20.0 | 40.0 | 0.0 | 0.0
Edge Hill University | Environment | 11.70 | 0.0 | 0.0 | 75.0 | 25.0 | 0.0
Edge Hill University | Overall | 11.70 | 9 | 27 | 40 | 24 | 0
University of Exeter | Outputs | 25.92 | 37.6 | 40.6 | 21.8 | 0.0 | 0.0
University of Exeter | Impact | 25.92 | 30.0 | 60.0 | 10.0 | 0.0 | 0.0
University of Exeter | Environment | 25.92 | 87.5 | 12.5 | 0.0 | 0.0 | 0.0
University of Exeter | Overall | 25.92 | 44 | 40 | 16 | 0 | 0
Goldsmiths' College | Outputs | 9.50 | 23.1 | 41.0 | 30.8 | 5.1 | 0.0


Goldsmiths' College | Impact | 9.50 | 0.0 | 50.0 | 50.0 | 0.0 | 0.0
Goldsmiths' College | Environment | 9.50 | 0.0 | 62.5 | 37.5 | 0.0 | 0.0
Goldsmiths' College | Overall | 9.50 | 15 | 46 | 36 | 3 | 0
University of Greenwich | Outputs | 10.00 | 4.9 | 17.1 | 60.9 | 14.7 | 2.4
University of Greenwich | Impact | 10.00 | 0.0 | 0.0 | 10.0 | 50.0 | 40.0
University of Greenwich | Environment | 10.00 | 0.0 | 0.0 | 50.0 | 50.0 | 0.0
University of Greenwich | Overall | 10.00 | 3 | 11 | 49 | 27 | 10
University of Huddersfield | Outputs | 11.75 | 8.3 | 45.9 | 39.5 | 6.3 | 0.0
University of Huddersfield | Impact | 11.75 | 40.0 | 0.0 | 50.0 | 10.0 | 0.0
University of Huddersfield | Environment | 11.75 | 0.0 | 75.0 | 25.0 | 0.0 | 0.0
University of Huddersfield | Overall | 11.75 | 13 | 41 | 40 | 6 | 0
University of Hull | Outputs | 13.60 | 15.1 | 32.1 | 43.4 | 7.5 | 1.9
University of Hull | Impact | 13.60 | 0.0 | 20.0 | 60.0 | 20.0 | 0.0
University of Hull | Environment | 13.60 | 0.0 | 0.0 | 75.0 | 25.0 | 0.0
University of Hull | Overall | 13.60 | 10 | 25 | 51 | 13 | 1
King's College London | Outputs | 36.30 | 34.1 | 49.2 | 14.4 | 2.3 | 0.0
King's College London | Impact | 36.30 | 92.0 | 8.0 | 0.0 | 0.0 | 0.0
King's College London | Environment | 36.30 | 87.5 | 12.5 | 0.0 | 0.0 | 0.0
King's College London | Overall | 36.30 | 54 | 35 | 10 | 1 | 0
Lancaster University | Outputs | 11.00 | 24.4 | 51.2 | 22.0 | 2.4 | 0.0
Lancaster University | Impact | 11.00 | 20.0 | 50.0 | 30.0 | 0.0 | 0.0
Lancaster University | Environment | 11.00 | 37.5 | 62.5 | 0.0 | 0.0 | 0.0
Lancaster University | Overall | 11.00 | 25 | 53 | 20 | 2 | 0
University of Leeds | Outputs | 32.10 | 17.4 | 49.5 | 30.6 | 2.5 | 0.0
University of Leeds | Impact | 32.10 | 60.0 | 30.0 | 10.0 | 0.0 | 0.0
University of Leeds | Environment | 32.10 | 0.0 | 87.5 | 12.5 | 0.0 | 0.0
University of Leeds | Overall | 32.10 | 23 | 52 | 23 | 2 | 0
Leeds Beckett University | Outputs | 11.61 | 6.0 | 18.0 | 52.0 | 22.0 | 2.0
Leeds Beckett University | Impact | 11.61 | 0.0 | 70.0 | 30.0 | 0.0 | 0.0
Leeds Beckett University | Environment | 11.61 | 0.0 | 0.0 | 75.0 | 25.0 | 0.0
Leeds Beckett University | Overall | 11.61 | 4 | 26 | 51 | 18 | 1
University of Leicester | Outputs | 24.10 | 18.3 | 37.6 | 34.4 | 9.7 | 0.0
University of Leicester | Impact | 24.10 | 0.0 | 66.7 | 33.3 | 0.0 | 0.0
University of Leicester | Environment | 24.10 | 0.0 | 75.0 | 25.0 | 0.0 | 0.0
University of Leicester | Overall | 24.10 | 12 | 49 | 33 | 6 | 0
University of Lincoln | Outputs | 6.60 | 12.5 | 33.3 | 41.7 | 12.5 | 0.0
University of Lincoln | Impact | 6.60 | 20.0 | 20.0 | 10.0 | 50.0 | 0.0
University of Lincoln | Environment | 6.60 | 0.0 | 25.0 | 75.0 | 0.0 | 0.0
University of Lincoln | Overall | 6.60 | 12 | 30 | 40 | 18 | 0
Liverpool Hope University | Outputs | 12.20 | 11.4 | 48.6 | 28.6 | 11.4 | 0.0
Liverpool Hope University | Impact | 12.20 | 40.0 | 0.0 | 30.0 | 30.0 | 0.0
Liverpool Hope University | Environment | 12.20 | 0.0 | 37.5 | 62.5 | 0.0 | 0.0
Liverpool Hope University | Overall | 12.20 | 15 | 38 | 34 | 13 | 0
Liverpool John Moores University | Outputs | 12.80 | 16.0 | 54.0 | 26.0 | 0.0 | 4.0
Liverpool John Moores University | Impact | 12.80 | 0.0 | 40.0 | 60.0 | 0.0 | 0.0
Liverpool John Moores University | Environment | 12.80 | 0.0 | 50.0 | 50.0 | 0.0 | 0.0


Liverpool John Moores University | Overall | 12.80 | 10 | 51 | 36 | 0 | 3
University College London (IoE) | Outputs | 219.00 | 28.1 | 39.8 | 26.0 | 5.0 | 1.1
University College London (IoE) | Impact | 219.00 | 73.9 | 22.6 | 3.5 | 0.0 | 0.0
University College London (IoE) | Environment | 219.00 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0
University College London (IoE) | Overall | 219.00 | 48 | 30 | 18 | 3 | 1
University College London (excluding IoE) | Outputs | 8.20 | 13.8 | 27.6 | 44.8 | 6.9 | 6.9
University College London (excluding IoE) | Impact | 8.20 | 40.0 | 40.0 | 20.0 | 0.0 | 0.0
University College London (excluding IoE) | Environment | 8.20 | 0.0 | 50.0 | 50.0 | 0.0 | 0.0
University College London (excluding IoE) | Overall | 8.20 | 17 | 33 | 41 | 5 | 4
London Metropolitan University | Outputs | 3.80 | 10.5 | 31.6 | 52.6 | 5.3 | 0.0
London Metropolitan University | Impact | 3.80 | 40.0 | 60.0 | 0.0 | 0.0 | 0.0
London Metropolitan University | Environment | 3.80 | 0.0 | 37.5 | 62.5 | 0.0 | 0.0
London Metropolitan University | Overall | 3.80 | 15 | 38 | 44 | 3 | 0
Loughborough University | Outputs | 7.70 | 22.9 | 54.2 | 22.9 | 0.0 | 0.0
Loughborough University | Impact | 7.70 | 30.0 | 70.0 | 0.0 | 0.0 | 0.0
Loughborough University | Environment | 7.70 | 50.0 | 50.0 | 0.0 | 0.0 | 0.0
Loughborough University | Overall | 7.70 | 28 | 57 | 15 | 0 | 0
University of Manchester | Outputs | 33.10 | 31.3 | 35.1 | 28.3 | 5.3 | 0.0
University of Manchester | Impact | 33.10 | 40.0 | 60.0 | 0.0 | 0.0 | 0.0
University of Manchester | Environment | 33.10 | 50.0 | 50.0 | 0.0 | 0.0 | 0.0
University of Manchester | Overall | 33.10 | 36 | 42 | 19 | 3 | 0
Manchester Metropolitan University | Outputs | 22.50 | 20.5 | 35.2 | 40.9 | 3.4 | 0.0
Manchester Metropolitan University | Impact | 22.50 | 36.7 | 63.3 | 0.0 | 0.0 | 0.0
Manchester Metropolitan University | Environment | 22.50 | 12.5 | 62.5 | 25.0 | 0.0 | 0.0
Manchester Metropolitan University | Overall | 22.50 | 23 | 44 | 31 | 2 | 0
Newcastle University | Outputs | 16.70 | 21.7 | 33.4 | 39.1 | 5.8 | 0.0
Newcastle University | Impact | 16.70 | 26.7 | 73.3 | 0.0 | 0.0 | 0.0
Newcastle University | Environment | 16.70 | 12.5 | 87.5 | 0.0 | 0.0 | 0.0
Newcastle University | Overall | 16.70 | 21 | 50 | 25 | 4 | 0
Newman University | Outputs | 7.40 | 16.7 | 36.6 | 36.7 | 10.0 | 0.0
Newman University | Impact | 7.40 | 0.0 | 0.0 | 90.0 | 10.0 | 0.0
Newman University | Environment | 7.40 | 0.0 | 12.5 | 75.0 | 12.5 | 0.0
Newman University | Overall | 7.40 | 11 | 26 | 53 | 10 | 0
University of Northampton | Outputs | 14.60 | 3.8 | 7.5 | 43.4 | 45.3 | 0.0
University of Northampton | Impact | 14.60 | 0.0 | 80.0 | 20.0 | 0.0 | 0.0
University of Northampton | Environment | 14.60 | 0.0 | 25.0 | 62.5 | 12.5 | 0.0
University of Northampton | Overall | 14.60 | 2 | 25 | 42 | 31 | 0
University of Nottingham | Outputs | 24.60 | 31.2 | 44.1 | 22.5 | 1.1 | 1.1
University of Nottingham | Impact | 24.60 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0
University of Nottingham | Environment | 24.60 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0
University of Nottingham | Overall | 24.60 | 55 | 29 | 15 | 0 | 1
Nottingham Trent University | Outputs | 20.90 | 9.9 | 22.5 | 39.4 | 18.3 | 9.9
Nottingham Trent University | Impact | 20.90 | 0.0 | 0.0 | 33.3 | 66.7 | 0.0
Nottingham Trent University | Environment | 20.90 | 0.0 | 0.0 | 87.5 | 12.5 | 0.0
Nottingham Trent University | Overall | 20.90 | 6 | 15 | 45 | 28 | 6
Open University | Outputs | 54.26 | 16.8 | 41.6 | 33.2 | 8.4 | 0.0


Open University | Impact | 54.26 | 60.0 | 20.0 | 20.0 | 0.0 | 0.0
Open University | Environment | 54.26 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0
Open University | Overall | 54.26 | 38 | 31 | 26 | 5 | 0
University of Oxford | Outputs | 39.22 | 50.6 | 36.4 | 12.4 | 0.6 | 0.0
University of Oxford | Impact | 39.22 | 84.0 | 16.0 | 0.0 | 0.0 | 0.0
University of Oxford | Environment | 39.22 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0
University of Oxford | Overall | 39.22 | 65 | 27 | 8 | 0 | 0
Oxford Brookes University | Outputs | 10.50 | 16.3 | 34.9 | 37.2 | 9.3 | 2.3
Oxford Brookes University | Impact | 10.50 | 0.0 | 0.0 | 60.0 | 40.0 | 0.0
Oxford Brookes University | Environment | 10.50 | 0.0 | 25.0 | 62.5 | 12.5 | 0.0
Oxford Brookes University | Overall | 10.50 | 11 | 26 | 46 | 16 | 1
University of Plymouth | Outputs | 24.00 | 13.4 | 35.4 | 34.1 | 17.1 | 0.0
University of Plymouth | Impact | 24.00 | 0.0 | 63.3 | 36.7 | 0.0 | 0.0
University of Plymouth | Environment | 24.00 | 0.0 | 62.5 | 37.5 | 0.0 | 0.0
University of Plymouth | Overall | 24.00 | 9 | 45 | 35 | 11 | 0
University of Reading | Outputs | 15.60 | 24.6 | 55.7 | 18.1 | 1.6 | 0.0
University of Reading | Impact | 15.60 | 53.3 | 10.0 | 36.7 | 0.0 | 0.0
University of Reading | Environment | 15.60 | 12.5 | 62.5 | 25.0 | 0.0 | 0.0
University of Reading | Overall | 15.60 | 29 | 47 | 23 | 1 | 0
Roehampton University | Outputs | 14.20 | 31.0 | 48.3 | 19.0 | 1.7 | 0.0
Roehampton University | Impact | 14.20 | 0.0 | 60.0 | 40.0 | 0.0 | 0.0
Roehampton University | Environment | 14.20 | 0.0 | 50.0 | 37.5 | 12.5 | 0.0
Roehampton University | Overall | 14.20 | 20 | 51 | 26 | 3 | 0
St Mary's University, Twickenham | Outputs | 11.96 | 10.3 | 13.8 | 20.7 | 41.4 | 13.8
St Mary's University, Twickenham | Impact | 11.96 | 0.0 | 0.0 | 50.0 | 50.0 | 0.0
St Mary's University, Twickenham | Environment | 11.96 | 0.0 | 0.0 | 0.0 | 75.0 | 25.0
St Mary's University, Twickenham | Overall | 11.96 | 7 | 9 | 23 | 48 | 13
University of Sheffield | Outputs | 14.50 | 22.6 | 68.0 | 9.4 | 0.0 | 0.0
University of Sheffield | Impact | 14.50 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0
University of Sheffield | Environment | 14.50 | 25.0 | 75.0 | 0.0 | 0.0 | 0.0
University of Sheffield | Overall | 14.50 | 38 | 56 | 6 | 0 | 0
Sheffield Hallam University | Outputs | 12.40 | 15.4 | 55.8 | 25.0 | 1.9 | 1.9
Sheffield Hallam University | Impact | 12.40 | 0.0 | 50.0 | 10.0 | 40.0 | 0.0
Sheffield Hallam University | Environment | 12.40 | 12.5 | 75.0 | 12.5 | 0.0 | 0.0
Sheffield Hallam University | Overall | 12.40 | 12 | 57 | 21 | 9 | 1
University of Southampton | Outputs | 22.00 | 17.9 | 51.3 | 29.5 | 1.3 | 0.0
University of Southampton | Impact | 22.00 | 76.7 | 23.3 | 0.0 | 0.0 | 0.0
University of Southampton | Environment | 22.00 | 25.0 | 62.5 | 12.5 | 0.0 | 0.0
University of Southampton | Overall | 22.00 | 31 | 47 | 21 | 1 | 0
Staffordshire University | Outputs | 6.75 | 25.0 | 46.4 | 17.9 | 10.7 | 0.0
Staffordshire University | Impact | 6.75 | 0.0 | 0.0 | 60.0 | 40.0 | 0.0
Staffordshire University | Environment | 6.75 | 0.0 | 0.0 | 37.5 | 62.5 | 0.0
Staffordshire University | Overall | 6.75 | 16 | 30 | 30 | 24 | 0
University of Sunderland | Outputs | 21.10 | 0.0 | 16.5 | 32.9 | 40.5 | 10.1
University of Sunderland | Impact | 21.10 | 26.7 | 0.0 | 36.6 | 10.0 | 26.7
University of Sunderland | Environment | 21.10 | 0.0 | 0.0 | 37.5 | 62.5 | 0.0


University of Sunderland | Overall | 21.10 | 5 | 11 | 34 | 38 | 12
University of Sussex | Outputs | 16.10 | 24.6 | 40.3 | 29.8 | 5.3 | 0.0
University of Sussex | Impact | 16.10 | 26.7 | 50.0 | 23.3 | 0.0 | 0.0
University of Sussex | Environment | 16.10 | 25.0 | 75.0 | 0.0 | 0.0 | 0.0
University of Sussex | Overall | 16.10 | 25 | 48 | 24 | 3 | 0
University of Warwick | Outputs | 34.57 | 23.1 | 43.3 | 28.0 | 5.6 | 0.0
University of Warwick | Impact | 34.57 | 40.0 | 50.0 | 10.0 | 0.0 | 0.0
University of Warwick | Environment | 34.57 | 62.5 | 37.5 | 0.0 | 0.0 | 0.0
University of Warwick | Overall | 34.57 | 32 | 44 | 20 | 4 | 0
University of the West of England, Bristol | Outputs | 10.20 | 18.6 | 34.9 | 44.2 | 2.3 | 0.0
University of the West of England, Bristol | Impact | 10.20 | 40.0 | 40.0 | 10.0 | 10.0 | 0.0
University of the West of England, Bristol | Environment | 10.20 | 0.0 | 12.5 | 87.5 | 0.0 | 0.0
University of the West of England, Bristol | Overall | 10.20 | 20 | 33 | 44 | 3 | 0
The University of West London | Outputs | 3.90 | 7.7 | 30.8 | 53.8 | 7.7 | 0.0
The University of West London | Impact | 3.90 | 0.0 | 0.0 | 20.0 | 80.0 | 0.0
The University of West London | Environment | 3.90 | 0.0 | 0.0 | 12.5 | 75.0 | 12.5
The University of West London | Overall | 3.90 | 5 | 20 | 41 | 32 | 2
University of Winchester | Outputs | 7.13 | 18.9 | 40.6 | 27.0 | 10.8 | 2.7
University of Winchester | Impact | 7.13 | 0.0 | 40.0 | 60.0 | 0.0 | 0.0
University of Winchester | Environment | 7.13 | 0.0 | 0.0 | 87.5 | 12.5 | 0.0
University of Winchester | Overall | 7.13 | 12 | 35 | 42 | 9 | 2
University of Wolverhampton | Outputs | 9.20 | 7.1 | 42.9 | 42.9 | 7.1 | 0.0
University of Wolverhampton | Impact | 9.20 | 0.0 | 40.0 | 60.0 | 0.0 | 0.0
University of Wolverhampton | Environment | 9.20 | 0.0 | 0.0 | 100.0 | 0.0 | 0.0
University of Wolverhampton | Overall | 9.20 | 5 | 36 | 54 | 5 | 0
University of Worcester | Outputs | 14.30 | 11.5 | 21.2 | 28.8 | 32.7 | 5.8
University of Worcester | Impact | 14.30 | 0.0 | 40.0 | 10.0 | 50.0 | 0.0
University of Worcester | Environment | 14.30 | 0.0 | 0.0 | 0.0 | 100.0 | 0.0
University of Worcester | Overall | 14.30 | 7 | 22 | 21 | 46 | 4
University of York | Outputs | 22.30 | 27.6 | 43.7 | 21.8 | 4.6 | 2.3
University of York | Impact | 22.30 | 73.3 | 26.7 | 0.0 | 0.0 | 0.0
University of York | Environment | 22.30 | 75.0 | 25.0 | 0.0 | 0.0 | 0.0
University of York | Overall | 22.30 | 44 | 37 | 15 | 3 | 1
York St John University | Outputs | 9.40 | 5.0 | 15.0 | 57.5 | 22.5 | 0.0
York St John University | Impact | 9.40 | 0.0 | 0.0 | 80.0 | 10.0 | 10.0
York St John University | Environment | 9.40 | 0.0 | 0.0 | 0.0 | 62.5 | 37.5
York St John University | Overall | 9.40 | 3 | 10 | 53 | 26 | 8
University of Aberdeen | Outputs | 9.60 | 6.1 | 48.4 | 39.4 | 6.1 | 0.0
University of Aberdeen | Impact | 9.60 | 20.0 | 70.0 | 10.0 | 0.0 | 0.0
University of Aberdeen | Environment | 9.60 | 0.0 | 87.5 | 12.5 | 0.0 | 0.0
University of Aberdeen | Overall | 9.60 | 8 | 59 | 29 | 4 | 0
University of Dundee | Outputs | 8.80 | 13.5 | 67.6 | 13.5 | 5.4 | 0.0


University of Dundee | Impact | 8.80 | 0.0 | 50.0 | 50.0 | 0.0 | 0.0
University of Dundee | Environment | 8.80 | 0.0 | 75.0 | 25.0 | 0.0 | 0.0
University of Dundee | Overall | 8.80 | 9 | 65 | 22 | 4 | 0
University of Edinburgh | Outputs | 39.97 | 19.4 | 48.5 | 29.9 | 2.2 | 0.0
University of Edinburgh | Impact | 39.97 | 76.0 | 24.0 | 0.0 | 0.0 | 0.0
University of Edinburgh | Environment | 39.97 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0
University of Edinburgh | Overall | 39.97 | 43 | 36 | 20 | 1 | 0
University of Glasgow | Outputs | 37.50 | 17.1 | 43.4 | 30.9 | 7.9 | 0.7
University of Glasgow | Impact | 37.50 | 40.0 | 60.0 | 0.0 | 0.0 | 0.0
University of Glasgow | Environment | 37.50 | 87.5 | 12.5 | 0.0 | 0.0 | 0.0
University of Glasgow | Overall | 37.50 | 32 | 42 | 20 | 6 | 0
Glasgow Caledonian University | Outputs | 6.40 | 13.0 | 34.8 | 34.8 | 17.4 | 0.0
Glasgow Caledonian University | Impact | 6.40 | 0.0 | 60.0 | 40.0 | 0.0 | 0.0
Glasgow Caledonian University | Environment | 6.40 | 0.0 | 0.0 | 62.5 | 37.5 | 0.0
Glasgow Caledonian University | Overall | 6.40 | 8 | 35 | 40 | 17 | 0
University of Stirling | Outputs | 14.00 | 25.0 | 45.8 | 22.9 | 6.3 | 0.0
University of Stirling | Impact | 14.00 | 20.0 | 80.0 | 0.0 | 0.0 | 0.0
University of Stirling | Environment | 14.00 | 12.5 | 87.5 | 0.0 | 0.0 | 0.0
University of Stirling | Overall | 14.00 | 22 | 59 | 15 | 4 | 0
University of Strathclyde | Outputs | 24.60 | 9.8 | 47.8 | 34.8 | 7.6 | 0.0
University of Strathclyde | Impact | 24.60 | 40.0 | 50.0 | 10.0 | 0.0 | 0.0
University of Strathclyde | Environment | 24.60 | 12.5 | 62.5 | 25.0 | 0.0 | 0.0
University of Strathclyde | Overall | 24.60 | 16 | 51 | 28 | 5 | 0
University of the West of Scotland | Outputs | 14.00 | 6.4 | 44.7 | 44.6 | 4.3 | 0.0
University of the West of Scotland | Impact | 14.00 | 0.0 | 60.0 | 20.0 | 20.0 | 0.0
University of the West of Scotland | Environment | 14.00 | 0.0 | 0.0 | 50.0 | 50.0 | 0.0
University of the West of Scotland | Overall | 14.00 | 4 | 41 | 41 | 14 | 0
Cardiff University | Outputs | 20.60 | 32.1 | 43.5 | 20.6 | 3.8 | 0.0
Cardiff University | Impact | 20.60 | 60.0 | 40.0 | 0.0 | 0.0 | 0.0
Cardiff University | Environment | 20.60 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0
Cardiff University | Overall | 20.60 | 48 | 36 | 14 | 2 | 0
Queen's University Belfast | Outputs | 22.80 | 25.3 | 54.2 | 16.9 | 3.6 | 0.0
Queen's University Belfast | Impact | 22.80 | 46.7 | 53.3 | 0.0 | 0.0 | 0.0
Queen's University Belfast | Environment | 22.80 | 62.5 | 37.5 | 0.0 | 0.0 | 0.0
Queen's University Belfast | Overall | 22.80 | 35 | 52 | 11 | 2 | 0
Stranmillis University College | Outputs | 5.00 | 30.0 | 25.0 | 35.0 | 10.0 | 0.0


Stranmillis University College | Impact | 5.00 | 0.0 | 40.0 | 0.0 | 60.0 | 0.0
Stranmillis University College | Environment | 5.00 | 0.0 | 0.0 | 37.5 | 62.5 | 0.0
Stranmillis University College | Overall | 5.00 | 20 | 24 | 28 | 28 | 0
University of Ulster | Outputs | 12.20 | 12.2 | 43.9 | 31.7 | 12.2 | 0.0
University of Ulster | Impact | 12.20 | 80.0 | 20.0 | 0.0 | 0.0 | 0.0
University of Ulster | Environment | 12.20 | 0.0 | 25.0 | 75.0 | 0.0 | 0.0
University of Ulster | Overall | 12.20 | 24 | 36 | 32 | 8 | 0


Appendix 4. Example impact case studies from higher education

Durham University: Changing educational practice through ‘Threshold Concepts’

REF UoA25 impact profile (for 3 case studies and impact template)

FTE Category A staff submitted | 4* | 3* | 2* | 1* | unclassified

24.50 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0

Summary of impact

A Threshold Concept is a specific element of a curriculum that is difficult for students to understand, such as 'opportunity cost' in Economics or 'stress transformation' in Engineering, but which, once grasped, irreversibly restructures the learner's understanding and opens up a new way of thinking. Consequently, by identifying Threshold Concepts, and then by adapting teaching practice and the focus of assessment, educators can significantly benefit students' progress. Durham's conceptualisation of Threshold Concepts has had a transformative effect on educational practice, curriculum design and assessment, particularly in Higher Education (HE), but also on schools, on educational policy and on conceptions of work-based learning and games design in international companies such as Nokia. The concept and its application have impacted on professional practice in HE institutions in at least 30 countries. In the UK, Threshold Concepts have been adopted by a number of high-profile educational agencies and organisations and are now embedded in the policy and practice of many institutions.

Underpinning research

Threshold Concepts were introduced by Professor Jan Meyer in 2002. The idea was first formally recognised in a Progress Report (p. 3) of the research project 'Enhancing Teaching-Learning Environments in Undergraduate Courses' (ETL1), funded through the Teaching and Learning Research Programme (TLRP) by the ESRC from 2001 to 2004.

This work lays the foundations of a framework with which pedagogical experts across disciplines can identify Threshold Concepts within their specific areas.

Threshold Concepts as an innovation have significantly changed the nature of curriculum design and assessment, and have enhanced effective practice, with direct impact on teaching practices and on students’ learning.

Research references

2 book chapters, 2 papers from Oxford Centre for Staff and Learning Development, 1 paper in Higher Education [2003-2010]

Details of Impact

Changing practice within HE: covering each of the 19 JACS codes, across 32 countries and extended to schools, further education, policy-making and industry.

Impact on curriculum design courses and funding: over 40 disciplinary workshops facilitated by the HE Academy; JISC ‘infokit’; SEDA materials for PGCert in HE courses; significant national government funding for projects in NZ, Australia and US.

Impact on practitioners' reflective practice: reviewing pedagogical strategies, evidenced in social media interest, a special issue of the Journal of Faculty Development, online blogs and project workshops.

Wider impact in education: uptake in FE, Foundation courses and schools, including in Korea.


Impact on industry: used by companies such as Nokia, Atos and Virtech in staff training, through a 9.4 million EU project involving six companies in five different countries.

Sources to corroborate the impact

Website dedicated to Threshold Concepts, with examples, links, theses and dissertations. Links to other initiatives referred to in Section 4, e.g. the USA's National Science Foundation (NSF) awards 2009-2011.


University of Edinburgh: Enhancing learning, teaching and assessment at university

REF UoA25 impact profile (for 5 case studies and impact template)

FTE Category A staff submitted | 4* | 3* | 2* | 1* | unclassified

39.97 | 76.0 | 24.0 | 0.0 | 0.0 | 0.0

Summary of impact

Outcomes of the research conducted at the University of Edinburgh (2001 to 2007) that have had the most far-reaching impact are a strong conceptualisation of the whole learning environment (including curricula, teaching, learning support, and assessment and feedback) and its influence on the quality of undergraduates’ learning. What gave these outcomes added resonance was a concern for disciplinary distinctiveness as well as more generic features; an alertness to the pervasive implications for day-to-day teaching-learning practices of mass 21st-century higher education; and a focus on enhancing and evaluating the student experience. The reach of the impact extends to university teachers, middle and senior academic managers, local and national bodies with responsibilities for surveying quality and standards and, albeit less directly, students. Staff in at least 21 universities in 12 countries have used the Experiences of Teaching and Learning Questionnaire (ETLQ). The National Student Survey questionnaire was influenced by the ETLQ, and has continuing UK-wide impact on teaching through students’ retrospective ratings of their experience. Project outputs were directed towards teaching staff through workshops, publications and invited presentations, followed by detailed advice on assessment and feedback of coursework.

Underpinning research

Two main research projects underpin the impact described here:

• ESRC/TLRP Award (£847,560, 2001-2005), evaluated as Very Good/Excellent: Enhancing Teaching-Learning Environments in Undergraduate Courses. The project involved collecting students' descriptions of their approaches to learning and the teaching and learning environment they had experienced, developed as the Experiences of Teaching and Learning Questionnaire (ETLQ). Administered to over 6500 students in 11 British universities, involving 90 university teachers. Collaboration with Durham and Coventry. Findings suggested that courses should focus on developing the distinctive ways of thinking and practising characteristic of the subject area concerned rather than on lists of intended learning outcomes.

• Higher Education Academy (£30,000, 2006-2007), Innovative Assessment Across the Disciplines

Research references

One book, 1 book chapter, ESRC and HEA final project reports, 1 paper in Higher Education Research and Development [2003-2009]

Details of impact

Generic and discipline-specific dissemination, involving interaction through institutional, national and international forums.

National Student Survey input; ETLQ used in 10 countries.

Book and invited presentations to more than 50 conferences/workshops in 7 countries, and to the Australian Learning and Teaching Council, the Australian Technology Universities Network, and Universitas 21.

Guides on integrative approaches to assessment and feedback for QAA Scotland, disseminated to staff in all Scottish universities.


Sources to corroborate the impact

Weblinks to archived webpages to corroborate impact, including the resources, presentations and reports on use of the ETLQ, QAA reports and the Enhancing Feedback website.


Open University: The impact of the National Student Survey: changing the behaviour of institutions, teachers and students

REF UoA25 impact profile (for 6 case studies and impact template)

FTE Category A staff submitted | 4* | 3* | 2* | 1* | unclassified

54.26 | 60.0 | 20.0 | 20.0 | 0.0 | 0.0

Summary of impact

Open University (OU) researchers were responsible for the development of the National Student Survey (NSS). It is an influential and widely cited source of information about the experience of students in higher education. Around 287,000 students at more than 300 institutions responded to the 2012 NSS. It has been incorporated into the league tables published annually by The Times, Sunday Times, Guardian and the online Complete University Guide. Performance in the survey has led institutions to take actions and initiatives to improve the student experience. The Ramsden report for the Higher Education Funding Council for England (HEFCE) indicates it has become an important element in quality enhancement.

Underpinning research

• HEFCE-commissioned research, Collecting and using student feedback on quality and standards of learning and teaching in HE, in collaboration with SQW Limited and members of the NOP (National Opinion Polls) Research Group. Literature review, outputs, and resultant further commissioning of pilot and feasibility studies which led to proceeding with the annual NSS from 2005.

• Later work in collaboration with Hobsons plc demonstrated that league tables which incorporate the NSS data in their ranking methodologies had a major impact on institutions' strategic planning.

Research references

Two HEFCE reports, 3 papers, in Open Learning, Assessment and Evaluation in Higher Education, and Studies in Higher Education. Listing of research funding by HEFCE.

Details of Impact

Student responses to the survey: 287,000 in 2012.

HEFCE acknowledgement of the central role of the research team in the introduction of the NSS, which 'has had an impact in two main areas; student choice and institutional behaviour'.

Affected quality assurance processes and institutional quality enhancement activities.

Australian Course Experience Questionnaire influenced by the research team report, seen in its similarity to the NSS.

Use of NSS results by the HEA.

Published accounts of the use of the NSS to implement initiatives aimed at enhancing the student experience at four universities, and links to another's strategic plan.

Potential future impact: introduction at postgraduate level.

Sources to corroborate the impact

Five reports (HEFCE/HEA), two papers (in Quality Assurance in Education and Active Learning in Education), and one contact for corroboration.


Plymouth University: Selection of doctors to specialty training on the basis of aptitude

REF UoA25 impact profile (for 3 case studies and impact template)

FTE Category A staff submitted | 4* | 3* | 2* | 1* | unclassified

24.00 | 0.0 | 63.3 | 36.7 | 0.0 | 0.0

Summary of impact

This case study demonstrates how research conducted at Plymouth University on recruitment and selection methodologies for postgraduate specialty training in medicine has impacted on the development and implementation of the recruitment process for core training and specialty training posts in medical-related fields throughout the UK. The impacts take the form of a new and improved shortlisting methodology and a model for selection centre recruitment at a national level. These overcome the problems revealed in recruitment to medical training during the introduction of the Modernising Medical Careers (MMC) initiative and the Medical Training Application Service (MTAS) debacle in 2007-08.

Underpinning research

NHS-funded pilot to evaluate evidence-based approaches to selection and education-skills assessment, leading to the situational judgement test (SJT) and a machine-marked clinical problem solving (CPS) test. A later national evaluation, conducted as part of the Academy of Medical Royal Colleges selection pilot for recruitment to core training posts across multiple specialties, confirmed that the SJT was fair and highly reliable, as desirable for high-stakes assessment, with consistently favourable feedback from both applicants and assessors.

Research references

One book chapter and five papers, in Anaesthesia (2), Clinical Medicine, the British Journal of Anaesthesia, and the Royal College of Anaesthetists Bulletin.

Details of impact

Implementation of an improved recruitment process for core training and specialty training posts in medical-related fields in the UK, for example by the Royal College of Anaesthetists.

Being taken forward by the UK Postgraduate Medical and Dental Selection and Recruitment Project Board.

Central to a new national model for recruitment to anaesthesia.

Training accessed by 539 assessors across the UK.

Sources to corroborate the impact

Reports to the Department of Health, Project Report, RCoA Bulletin, and two specific sources to corroborate (Recruitment Lead, Anaesthetics National Recruitment Office, and President, Royal College of Anaesthetists).


University of Sheffield: Developing Higher Education in Further Education colleges

REF UoA25 impact profile (for 3 case studies and impact template)

FTE Category A staff submitted | 4* | 3* | 2* | 1* | unclassified

14.50 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0

Summary of impact

A twelve-year programme of research (2001-12) led by Professor Gareth Parry on higher education in further education colleges has produced impacts on policy development, institutional strategy and professional practice in England. The beneficiaries are the central authorities for higher and further education, the colleges of further education and their university partners, college managers and tutors, and thereby students and employers. The types of impact are changes to national funding and reporting arrangements; enhancements to policy and organisational learning; and contributions to institutional capacity-building. The vehicles for achieving impact are collaborations with policy, professional and practitioner communities through expert programmes, consultancies, databases, directories and guides to good practice. The reach of the impact is national, cross-sector and institutional, with a wider influence on debates across the UK and international developments including in Australia.

Underpinning research

• The first research to chart and analyse the contemporary contribution of further education colleges to English higher education, contending that the structure of a two-sector system of further and higher education exercises a decisive, often contradictory, influence on efforts to build a larger role for colleges in higher education, as has been a goal of successive governments since 1997.

• Five grants from national agencies: Learning and Skills Development Agency, HEFCE (2), ESRC and BIS.

Research references

LSDA, HEFCE and BIS papers, and two journal papers in Higher Education Quarterly [2002-2012]

Details of Impact

Contributed to changes in policy, implementation, assessments of progress and the thinking of national bodies, resulting in improved choices for students and better-supported partnerships between colleges and universities. Policy learning through research briefings, expert consultancies, membership of key policy and advisory groups, and invitations to contribute to national inquiries.

Adoption of a minimum period of three years of security for the funding and student numbers available to colleges, enabling more opportunity for long-term strategic investment in higher education. Strengthened capacity of colleges to develop, manage and monitor strategies for higher education; 176 colleges were represented at five regional seminars. Follow-on projects by HEFCE led to higher education strategies from 240 out of 2156 eligible colleges.

Staff development and good practice guide, with 2173 copies distributed. Researchers were invited by the Australian government to undertake an equivalent exercise.

Sources to corroborate the impact


Head of Funding, HEFCE; Head of Qualifications, Skills Funding Agency; Policy Exchange report; LSIS guide; Director of Policy and Stakeholder Engagement, TAFE Directors Australia; Assistant Chief Executive, AoC.
