Shaping the Community Scorecard Action Research Project Final Report



Foreword

The UK Commission’s 2009 report, ‘Towards Ambition 2020: skills, jobs, growth’1, proposed that informed customers should increasingly drive supply, performance and quality, and that securing this shift would involve a transfer of trust and greater autonomy for providers. The report argued that information should be provided to inform learner choice, but that a wider range of information and measures should be developed to reflect providers’ responsiveness to their local communities.

LSIS has a long-standing interest in and experience of supporting and encouraging colleges and providers to play a strategic role in supporting their local communities, and so was keen to work with the UK Commission on the exploration of a ‘community scorecard’ to strengthen and demonstrate providers’ contribution and accountability to their communities.

The work was timely and relevant in the context of the ‘Big Society’ – founded on the high-level principles of freedom, fairness and responsibility – resonating with the intention to reorientate the accountability of public services away from government and towards customers and citizens. In this context, the National Improvement Partnership Board (NIPB) was charged with the development of early proposals for a new framework of public information for the sector, and agreed to take an overview of the ‘community scorecard’ project as well.

Eleven providers took part in the action research and put a considerable amount of time and effort into exploring what a ‘community scorecard’ might look like for their institutions. The group reflected the diversity in the range and mix of institutional and administrative arrangements in different localities, and the different socio-economic profiles of the communities the sector serves. It also deliberately included providers at different stages of development in their thinking on local accountability to gain a realistic and wide perspective on developmental needs. The diversity has resulted in a wide-ranging and rich report that outlines areas of good practice but also some of the challenges experienced.

We hope that the learning shared in this report may help to inform colleagues in the sector who are thinking about how to articulate as fully and transparently as possible their responsiveness and contribution to their communities within new frameworks of accountability. In LSIS, we are also very keen to understand what more we could do to support the sector to develop its thinking in this important new area of work.

Caroline Mager
Executive Director, Policy, Research and Communications, LSIS

Ian Kinder
Deputy Director, UK Commission for Employment and Skills

1 UK Commission for Employment and Skills, ‘Towards Ambition 2020: skills, jobs, growth’: www.ukces.org.uk/assets/bispartners/ukces/docs/publications/towards-ambition-2020-skills-jobs-growth.pdf

In spring 2010, the UK Commission for Employment and Skills and LSIS came together to explore – through a series of sector-led action research projects – the implications for colleges and providers of the UK Commission’s proposal for ‘community scorecards’ to be part of an emerging suite of public information to inform choice.


Contact details

Enquiries about this report should be addressed, in the first instance, to:

Jenny Williams, Head of Policy, LSIS
Email: [email protected]

or

Alison Morris, Programme Manager, UK Commission for Employment and Skills
Email: [email protected]


Contents

Foreword
Contents
Executive summary
1. Introduction
2. Policy context
3. Methodology
4. Research conclusions
5. Emerging implications and six proposals for policy development and implementation


Executive summary

LSIS has an interest in and experience of supporting and encouraging colleges and providers to play a strategic role in supporting their local communities, and agreed to work with the UK Commission on the development of a ‘community scorecard’ to strengthen and demonstrate providers’ contribution and accountability to their communities.

At the time the project began in summer 2010, the concept of the ‘Big Society’ was emerging as an overarching framework for the Coalition Government’s high-level principles of freedom, fairness and responsibility. The concept of a ‘community scorecard’ seemed to resonate with the intention to reorientate the accountability of public services away from government and towards customers and citizens.

The approach adopted was to involve the sector in shaping the development of the community scorecard from the beginning and willing providers were invited – from FE colleges, the work-based and adult and community learning sectors – to join a small action research group. The group reflected the diversity in the range and mix of institutional and administrative arrangements in different localities and the different socio-economic profiles of the communities the sector serves. It also deliberately included providers at different stages of development in their thinking about the community scorecard to gain a realistic and wide perspective on developmental needs. Each provider received a small grant towards their project and all were supported by RCU Research and Consultancy2, who coordinated the research project.

The diversity of the sector, reflected in the action research group, underlined the necessity to avoid any prescriptive design for the community scorecard. Rather, the idea was to provide a spectrum of opportunities from which providers could select the most appropriate mix for their specific context.

The project ran between September and December 2010, in parallel with the emergence of the new government’s approach to public service reform and its implications for FE and skills. These have included a stronger emphasis on the sector developing a more comprehensive range of public information through the National Improvement Partnership Board (NIPB), a focus on localism, and in the context of greater freedoms and flexibilities, a strong steer to the sector to develop its responsiveness to its communities and its networks of partners. During this period a set of national indicators and measures has also emerged, creating new drivers from the centre of the system.

Some of the opportunities and tensions between these various developments are reflected in the challenges the providers faced in establishing their community scorecards in a rapidly evolving context. In particular, the determination of the Coalition Government to devolve greater responsibility to frontline providers and to emphasise the need for accountability to their communities and localities became apparent late in the action research. The action research was not, therefore, able to explore fully the implications of this shift.

In ‘Towards Ambition 2020: skills, jobs, growth’ (October 2009), the UK Commission made a series of proposals for “building a more strategic, agile and labour market led employment and skills system”. At the heart of these proposals was an intention to “increase trust in, and authority to, learning providers”. Providing more public information on the performance of providers, and increasing their accountability to their communities, was seen as one way to do this.

2 http://www.rcu.co.uk/



This report outlines the experiences of the eleven providers involved in the action research and identifies some overall research conclusions from the project. It also considers the emerging implications of these conclusions for future policy development and implementation and makes six proposals.

Research conclusions

The intention of the action research project was to give a group of providers the space to explore what information could be used to engage with, and to demonstrate their responsiveness and contribution to, their community or communities. The diversity of the group of providers, both in relation to how far these issues had previously been considered as well as differences in the types of institution and their geographical location, has meant that the question has been explored from a number of different angles.

Purpose of a community scorecard

Although the outline purpose of the community scorecard had been defined – to strengthen and demonstrate providers’ contribution and accountability to their communities – the eleven providers’ rationales for taking part in the project covered an array of reasons that reflected their diversity and range.

They all, however, shared the ambition of delivering services to meet community needs and of establishing performance measures. Other drivers for engagement in the project included providing evidence to support public relations campaigns and funding decisions, influencing the strategic direction of the organisation, and commercial business expansion. A small group of the providers explicitly saw the development of their community scorecard as a way of strengthening their strategic role in the locality.

Overall, the experience of the providers suggests that if an organisation is to develop a community scorecard, then it needs to be part of a long-term strategy and a real desire for public accountability. The community scorecard has the potential to be an invaluable continuous improvement tool for providers who genuinely seek and act upon community feedback, as well as serving a variety of other uses, from supporting community engagement to PR.

Defining the community

Defining the community was essential at the outset to enable providers to target their research and seek meaningful views to inform their scorecard. Some providers felt they served one diverse community, whereas others thought they served several distinct communities with differing needs. Some communities were locality-based; others reflected the interests of groups of employers, for example the land-based sector.

Some providers felt that it would be difficult to develop a single community scorecard, even one just for their organisation, and chose to focus their community scorecard on one or two distinct communities that they served, whilst two provider networks explored the possibility of putting in place a single scorecard for the whole network.

Securing legitimacy for the community scorecard

Each provider’s journey in developing the scorecard was unique according to the internal and external environments in which they operated. Although securing legitimacy was an area of consideration for all, it was explored in more detail amongst those providers at a more advanced stage in their developmental journey. One provider used the action research project to ‘soft launch’ its community scorecard with the community, but this was the culmination of two years of development within the organisation, including with governors. Individuals engaged in the development of a community scorecard felt that top-level commitment was essential, not least to ensure that sufficient resource was available to develop and maintain the scorecard.


Approaches of individual providers

The providers had different starting points and ambitions for their projects. The research was most effective when the provider had a clear view at the outset of their community and their research methodology.

The majority of providers conducted primary research to inform the development of their community scorecard measures using new data sources. However, there was recognition that this was time-consuming to collect and would require significant on-going resource commitment. Others considered an approach that involved existing data sources to some degree.

The level of research expertise differed amongst providers, for instance, in the design of research instruments to provide robust and verifiable results. The amount of resource available also varied widely.

Emerging measures

There were differing views among providers as to what types of measures to include in a community scorecard; the intelligence gathered was wide-ranging, encompassing perceptions, customer satisfaction, stakeholder views and community needs. Where measures emerged beyond the existing national indicators, they centred on community engagement – for example, numbers of students volunteering in the community; the number and range of events open to the community; the extent to which patterns of participation could be mapped to areas of disadvantage; and the extent to which the provider’s curriculum offer could be seen to be meeting the needs of the community. One provider also suggested a measure of ‘innovative and visionary leadership’.

Providers recognised the value of having a national set of consistent core measures and spent some time exploring how to capture indicators of learner destinations as part of this. Equally, however, there was recognition of the value of local measures and the suggestion that guidance would be helpful for providers on a spectrum of measures they could consider, linked to the overall goals of their institution.

Publication of results

A number of providers considered how they might present their scorecard to the community, with a web-based approach being particularly popular. Some providers also considered the extent to which the scorecard could become an interactive tool that invited user feedback. There was also consideration given as to how the scorecard would relate to existing published data.

Some providers expressed concern about striking an appropriate balance between transparency and positive publicity in the presentation of their scorecards, particularly where they were operating in competitive environments.

There was a general consensus that to ensure a scorecard is utilised by both the organisation and its community, it needs to be simple to use, easy to update and relevant.

Support for providers

It is clear from this project that providers looking to embark on the development of a community scorecard, or an approach to demonstrating their contribution to their communities, would benefit from advice and support, which at the very least includes dissemination of the findings from this research to enable providers to learn from one another’s experiences. Other support could include: mapping, analysis and dissemination of other community scorecard development activity, in the FE and skills sector and more widely across other public services; the establishment of good practice forums; and the development of a community scorecard toolkit. Further areas for possible support are identified in the following section on emerging implications.


Emerging implications and six proposals for policy development and implementation

This section builds on the research conclusions and discusses some of the emerging implications for policy development and implementation. Overall, some concerns were raised about the term ‘community scorecard’, and there are options for how this work could be described going forward. Although the action research was overseen by the NIPB, alongside its work on FE public information and the ‘Informing Choice Framework’, the shift towards greater freedoms that has emerged in the last year has created greater space and necessity for providers to develop strategic plans that are responsive to their communities. Within this context, new ways of thinking about accountability could assist providers to articulate as fully and transparently as possible their contribution to civic and community wellbeing.

Audiences

Ideas about the purpose of a ‘community scorecard’ varied between the providers and a range of different audiences were identified. The motivations for demonstrating accountability also vary depending on the audience, for example, the reasons for providing information to potential learners will not be the same as those for a provider wanting to make information available to their local authority.

There seem to be two main groups that are external to the provider. The first broad group are ‘customers’. This group includes existing and potential customers (learners, potential learners, employers), and what could be described as ‘the wider learning community’. The wider learning community encompasses organisations and groups that support people to get involved in learning (including community groups, support groups, career guidance professionals).

The second group are ‘strategic partners and civil society’. This group is broad and has increased in salience as the Coalition Government’s policy has emerged, and was therefore not a major focus of the action research. It includes local authorities and other public sector partners such as health and the police; and also private and third sector partners such as local enterprise partnerships, local chambers of commerce, or voluntary, social enterprise sector bodies.

Given its core purpose of human capacity development, the role of the FE and skills sector has relevance for a wide range of stakeholders and strategic players in the locality. It will be beneficial for the sector to develop awareness of its contribution with this wide-ranging audience. Further development of the scorecard concept should therefore explore the scope and priorities of strategic partners and civil society in order to position the sector more centrally in relation to the strategic development of communities.

Proposal 1: It is important to recognise the wide range of audiences for information on FE and skills. The sector needs to develop its awareness of the priorities of strategic partners and civil society in order to shape its response to the emerging ‘Big Society’ concept.

Presentation of information

Different audiences have different information requirements. For customers, the NIPB’s work to develop a “more comprehensive range of public information”, building on the Framework for Excellence3, is important in terms of securing a nationally consistent, comparable set of information on which learners and employers can take decisions about the types of provision and providers in which they wish to invest.

Other types of information for customers will be particular to different providers. The action research suggests that individual provider surveys of their communities are also helpful in highlighting the sort of local information that different audiences would find useful, for example: indicators on community engagement; how well participation reflects the socio-economic make-up of an area; or how effectively a provider’s curriculum offer responds to the priorities of the communities it serves.

3 Skills Funding Agency Framework for Excellence http://ffe.skillsfundingagency.bis.gov.uk/



Our research suggests that securing accountability to communities needs to be part of a long-term institutional strategy and plan. Measures of community accountability should reflect the overall goals of the institution, and be negotiated with and responsive to their communities, offering a framework in which providers can be publicly accountable for their contribution to social and economic life.

A key challenge for providers will be how to analyse, interpret, use and present an increasingly wide range of ‘public information’ on FE and skills in ways that are authentic and accessible. This challenge also extends to the provision of information for strategic partners.

Proposal 2: It would be helpful to develop a spectrum of measures as examples for providers, but ultimately the measures providers select must enable them, with their communities, to assess progress towards and achievement of their goals.

Proposal 3: Sector leaders should consider the new challenges and opportunities of working with an increasingly wide range of public information, and making sense of it with and for different audiences / stakeholders. LSIS could support the sector to explore how information and communication strategies should underpin the shift towards greater community responsiveness.

Proposal 4: The implications of emerging thinking about social productivity4 could be further explored as a possible guiding framework for generating more meaningful metrics to account for providers’ strategic engagement with and contribution to their communities and localities.

Dynamic engagement with communities

The devolution of responsibility and greater freedoms at organisational level, together with the emphasis on responsiveness to communities presents challenges to providers to develop more dynamic engagement with their communities, and empower customers and partners to interact with them to co-design services and programmes. Many of the providers involved in the community scorecard research envisaged presenting their information in a web-based format. The potential of technology to implement providers’ information and communications strategies in highly visible and accessible ways is, we know, being explored across the sector. Such approaches could become part of a provider’s overall narrative to both customers and strategic partners, clearly connecting it to the concerns and priorities of the communities it serves.

Proposal 5: Sector providers should explore innovative approaches to developing dynamic engagement with their communities, including how to harness the potential of technology, and the possible economies of scale of working across the sector.

Strategic direction and ownership

There was variation in the level of senior leadership involvement across the providers taking part in the action research. Where there was senior leadership involvement, the work was more likely to be used to support the strategic planning of the institution.

Proposal 6: Sector leaders should consider the implications of the shift to greater community responsiveness and public information for their approaches to accounting for their contribution to the economic and social life of their communities.

4 2020 Public Service Hub at the RSA; ‘The further education and skills sector in 2020: a social productivity approach’ http://www.lsis.org.uk/2020PSHQ


1. Introduction

In ‘Towards Ambition 2020: skills, jobs, growth’ (October 2009), the UK Commission for Employment and Skills made a series of proposals for “building a more strategic, agile and labour market led employment and skills system”. At the heart of these proposals was an intention to “increase trust in, and authority to, learning providers”. Providing more public information on the performance of providers and increasing their accountability to their communities was seen as one way to do this. The aim was for citizens, communities, employers and members of local public service boards to be empowered to become far more directly involved in shaping outcomes and improving the performance of the employment and skills system and to be better informed to make choices.

LSIS has an interest in, and experience of, supporting and encouraging colleges and providers to play a strategic role in supporting their local communities and agreed to work with the UK Commission to explore options for the development of community scorecards to strengthen and demonstrate providers’ contribution and local accountability to their communities. Alongside this project in 2010-11, LSIS has also supported twenty provider-led Peer Review and Development (PRD) groups focused on the theme of community development5 and, with the sector, is developing a strategic framework for effective community development.

At the same time as the community scorecard project began, in summer 2010, the concept of the ‘Big Society’ was emerging as an overarching framework for the Coalition Government’s high-level principles of freedom, fairness and responsibility. The idea of a ‘community scorecard’ seemed to resonate with the intention to reorientate the accountability of public services away from government and towards customers and citizens. The rationale for the research was to explore some of the ways in which providers could demonstrate accountability to their communities in order to better understand the journey from ‘vertical’ to ‘horizontal’ accountability.

The approach adopted was to involve the sector in shaping the development of the community scorecard from the beginning. Participants were invited from FE colleges, work-based, and adult and community learning providers, to join a small action research group. The group of eleven reflected the diversity of institutional and administrative arrangements in different localities, together with the different socio-economic profiles of the communities the sector serves. It also deliberately included providers at different stages of development in their thinking on the community scorecard to gain a realistic and wide perspective on developmental needs. Each provider received a small grant towards their project and all were supported by RCU Research and Consultancy, who coordinated the research project.

The diversity of the sector, reflected in the action research group, underlined the necessity to avoid any prescriptive design for the community scorecard. The purpose of the project was not to develop a blueprint for what the community scorecard should look like, but through an action research approach to explore some of the issues involved in developing ways for providers to provide more information to their communities in order to strengthen and demonstrate their local accountability – “to secure legitimacy for their strategic role and to play their part in local public service coalitions”6. The idea was to provide a spectrum of opportunities from which providers could select the most appropriate mix for their specific context.

5 See the evaluation of the community development peer review initiative at: http://www.excellencegateway.org.uk/page.aspx?o=323173
6 See Annex 1 for the initial LSIS/UKCES outline paper on the community scorecard


At the outset7 LSIS and the UK Commission suggested that the purposes of the community scorecard could encompass:

• establishing the fit between the mission of the organisation and local strategic priorities;

• raising awareness of the wider contribution that providers of further education make to securing social and economic well-being in their localities;

• communicating the economic and social returns to learning beyond learning outcomes;

• enhancing the accountability of providers to citizens and communities;

• providing a focus for developing leaders of learning and skills as strategic partners in local public consultation; and

• providing a stimulus to develop partnerships for better integrated delivery, efficiencies, and improved outcomes for learners.

The project ran between September and December 2010, in parallel with the emergence of the new government’s approach to public service reform and its implications for FE and skills. These have included a stronger emphasis on the sector developing a more comprehensive range of public information through the National Improvement Partnership Board (NIPB), a focus on localism, and, in the context of greater freedoms and flexibilities, a strong steer to the sector to develop its responsiveness to its communities and its networks of partners. During this period a set of national indicators and measures has also emerged, creating new drivers from the centre of the system8; and towards the end of the project the Framework for Excellence data was published9.

Some of the opportunities and tensions between these various developments are reflected in the challenges the providers faced in establishing their community scorecards in a rapidly evolving context. In particular, the determination of the Coalition Government to devolve greater responsibility to frontline providers and to emphasise the need for accountability to their communities and localities became apparent late in the action research. The action research was not, therefore, able to explore fully the implications of this shift.

This report outlines the experiences of the eleven action research providers and identifies some overall research conclusions from the project. It also considers the emerging implications of these conclusions for policy development and implementation.

7 See Annex 1 for the initial LSIS/UKCES outline paper on the community scorecard
8 BIS Structural Reform Plan – Impact Measures (October 2010) and SFA Guidance Note 6 (December 2010)
9 http://readingroom.lsc.gov.uk/SFA/FfE_Summary_Statistics_2010.pdf


2. Policy context

The community scorecard idea originated in the UK Commission for Employment and Skills report ‘Towards Ambition 2020: skills, jobs, growth’ (October 2009), which called for a more strategic, agile and labour market led employment and skills system and for increased trust in learning providers. By developing a community scorecard, providers would give the public relevant information on their performance, which would in turn increase their accountability to the communities they serve.

The report called for publicly-funded institutions to be more accountable to individuals, employers and communities through a system of quality labelling for public services. It went on to propose that this should include outcomes / destinations, customer satisfaction and quality indicators “balanced against a profile of the economic, social and labour market characteristics of the area in which a provider operates”. It anticipated the Coalition Government’s policy shifts, which have emphasised the importance for providers of ensuring that citizens’, communities’ and employers’ voices shape what they deliver and how they deliver it, empowering providers to offer responsive provision that fits with the priorities and patterns of social and economic life.

At a recent LSIS seminar, Maggie Galliers CBE, principal of Leicester College and an early adopter of the community scorecard ideas, explained it like this: “In order for people to better understand the contribution FE can make, they need to understand what it does... and so FE needs to be more transparent and accountable to the public. That should then become the basis for a dialogue about what society values and what it wants FE to do – recognising that it can’t do everything; enabling FE to make an effective contribution to the Big Society.”

The emerging Big Society concept has provided a backdrop to the action research on the community scorecard. Although still coming into focus and not without controversy, it is acting as a framework for the Coalition Government’s high-level principles of freedom, fairness and responsibility, principles that are driving reform in wider public service, as well as FE and skills.

For the FE and skills sector, the skills strategy published in November 2010 made clear the government’s intention to reorientate the accountability of the sector away from government and towards customers, citizens and employers: “Colleges and training organisations must respond to the real demands demonstrated by employers and learners.

“[Providers] should engage with local partners – consulting them on their business plans so that providers can set out how their offer...contributes to local economic priorities.

“Development of a more comprehensive range of public information will be published in January 2012.”10

The development of a “more comprehensive range of public information” has been led by the sector, through the National Improvement Partnership Board, which has also taken an overview of the progress and outcomes of the community scorecard action research project.

10 Strategy document: ‘Further Education – New Horizon: Investing in Skills for Sustainable Growth’ (November 2010), Department for Business, Innovation and Skills: http://www.bis.gov.uk/assets/biscore/further-education-skills/docs/s/10-1272-strategy-investing-in-skills-for-sustainable-growth.pdf


It builds on the Framework for Excellence data, published by the Skills Funding Agency towards the end of the community scorecard action research project in December 2010, the intention of which is that “giving learners and employers better information will drive a ‘step change’ in quality improvement”. The SFA describes the shift in the emphasis of accountability as being “founded on the relationship between provider and customer rather than provider and government”.

The government has from the start set its face against centralised performance management systems for holding the sector to account. There has been clarification that the Ofsted inspection regime will be simplified and less frequent for successful providers, and, working with the sector, the Department for Business, Innovation and Skills is engaged in further work aimed at streamlining the performance architecture of the FE and skills system.

Strengthening providers’ capacity to understand their communities’ needs, to respond effectively to those needs – even where they may be in tension with national policy priorities – and to account transparently to their communities for their contributions will become key skills in the new operating environment. Indeed, Lifelong Learning UK’s 2010 review of workforce skills11 suggests providers are already recognising this, citing “partnership working including engagement with communities” as a skills priority for the sector.

Defining ‘community’

There is a strong emphasis in recent announcements from BIS that ‘community’ is being used in its widest sense, to include the sectoral interests of localities and employers.

These interests come together around the agendas for Local Enterprise Partnerships in which, so far, skills appear to be a high priority but engagement with the FE and skills sector less so. With the changes to sub-national economic planning structures and the policy emphasis being placed on the sector’s contribution to stimulating and supporting economic growth, providers’ ability to show their responsiveness to the emerging priorities for Local Enterprise Partnerships will be an important dimension of the sector’s strategic role.

Other aspects of the Coalition Government’s localism agenda also resonate with the intentions of the community scorecard to encourage greater accountability to communities – particularly localities. Some elements of this programme continue the trajectory of the previous government – a less top-down approach, more citizen-driven services and an emphasis on the use of markets and competition. However, the current emphasis appears to be on decentralising power down to the lowest possible level, and this move to accelerate the role played by local communities is now being translated into law through a wide-ranging Bill12. In addition to the provisions to create local enterprise partnerships, other measures in the Bill strengthen the framework for local accountability by:

• giving councils a general power of competence through which they will be able to set up businesses and innovate in new ways including developing shared services, employee mutuals, outsourcing and widespread commissioning of functions;

• empowering local people through giving residents the power to instigate local referendums on any local issue, and voluntary and community groups the right to challenge local authorities over their services;

• allowing the Secretary of State to put in place shadow elected mayors in 12 city councils and to initiate a mayoral referendum; and

• abolishing the existing ethical governance framework, replacing this with a duty to promote and maintain high standards of conduct and a power to adopt a voluntary code of conduct.

11 http://www.lluk.org/wp-content/uploads/2011/01/Final-SSA-2010-UK.pdf
12 Decentralisation and the Localism Bill: From Big Government to Big Society. A guide to the Bill is available at http://www.communities.gov.uk/documents/localgovernment/pdf/1793908.pdf


LSIS has an interest in and experience of supporting and encouraging colleges and providers to play a strategic role in supporting their (local) communities and agreed to work with the UK Commission on the development of a community scorecard to strengthen and demonstrate providers’ contribution and local accountability to their communities. The initial scoping of the project13 was based on a review of the learning from previous policy research at LSIS, including:

• the Total Place pilots – LSIS commissioned and worked closely with the Association of Colleges (AoC) to explore the involvement of colleges in the pilots;

• the new duty on colleges to promote economic and social wellbeing in their localities – LSIS was commissioned by BIS to develop guidance for the sector;

• opening up college premises to the community to support informal learning – LSIS was commissioned by BIS to produce a good practice guide;

• tackling and preventing gun, knife and gang-related problems in the sector – LSIS has produced a good practice website to support the sector; and

• LSIS’s support programme for equalities, diversity and community development.

13 See Annex 1


3. Methodology

LSIS invited ‘pilots of the willing’ in summer 2010 through its standard open invitation process to sector providers. A number of responses were received and reviewed by LSIS and the UK Commission. The decisions took into account the need to have a range of different types of provider involved in the project from different parts of the country, and at different stages of development with their thinking about the community scorecard to enable a spectrum of developmental issues to be considered.

The successful providers were:

• Leicester College (East Midlands)
• Petroc (South West)
• Stockton Riverside College (North East)
• Barton Peveril College (Sixth Form) (South East)
• Otley College (Land-based) (East)
• In Touch Care Limited (Yorkshire and the Humber)
• GDR Solutions (UK) Limited (South West)
• Black Country Training Group (West Midlands)
• The Sellafield Ltd Unionlearn Centre (North West)
• Wandsworth Council Lifelong Learning (London)
• The Co-operative College (North West)

RCU was commissioned by LSIS and the UK Commission to provide informed and experienced consultancy support to help the eleven providers deliver an action research project that explored and helped to establish the purpose of a community scorecard, what it might look like, and how it might be used. The research took place between September and December 2010, with providers reporting in January 2011 and RCU final summative reporting in February 2011.

The research methodology was a five-stage approach, as follows:

Stage 1 – First seminar event: an opportunity for the participating providers to hear first-hand the policy context from both the UK Commission and LSIS on the community scorecard concept. Leicester College also shared their community scorecard journey to date. Providers’ initial discussions explored their different definitions of community, existing measures for evaluating impact, and their initial thoughts on the purpose and scope of their individual research projects.

Stage 2 – Developing providers’ project plans: RCU mentors visited each provider to help them develop a project plan including aims, objectives, senior in-house sponsor and proposed research approach, with timescales and milestones. The purpose of the project plan was to provide a framework within which providers could undertake their action research and also to form the basis for progress review.

Stage 3 – Mentoring, reporting and keeping in touch: there was regular contact between RCU mentors and the providers during the project. Providers supplied fortnightly reports to RCU detailing their progress. The reporting approach was flexible but provided an important mechanism to capture key learning, which was later reflected in providers’ final reports.

A workroom was also set up on the Excellence Gateway, into which provider project plans, reports and project communications were posted.


Stage 4 – Second site visits and seminar event: RCU mentors visited providers prior to the second seminar event to review research outcomes, support providers in the analysis of results and identify any potential measures that could be included in a community scorecard.

At the second seminar, attended by the UK Commission, LSIS, RCU, the participating providers and representatives from the Skills Funding Agency, each provider delivered a presentation on their project which included: the purpose of their community scorecard and key research activities undertaken; lessons learned; and next steps. This was followed by action learning sets which explored:

• how well providers’ projects progressed against the original project plan;

• what worked well and which particular factors / levers were most useful / successful;

• what had not worked well, why, and what had been the impact of these issues;

• what providers had achieved to date and what were their emerging findings;

• what more needed to be done;

• whether providers had been able to identify some suitable community scorecard measures;

• whether providers had been able to test the relevance of any community scorecard measures with the community;

• whether individual providers’ projects would continue beyond the period of the research;

• how providers had communicated their research internally and externally;

• whether Governors or Trustees had been involved and in what capacity;

• whether providers would choose to develop a community scorecard based on what they had learnt through the research;

• what factors should be considered for any extension of the community scorecard concept; and

• whether the development of a community scorecard would be considered a priority issue for the management team in their organisation.


Stage 5 – Provider reports and summative report: following the discussions at the seminar, providers were asked to submit final reports building on their presentations and the action learning / critical reflection undertaken. This final report is based on information from those reports.

Critical reflection was also undertaken on the wider implications of the community scorecard, structured around six dimensions – clarity, scope, measurability, verifiability, transferability and utility – using the following questions:

• From what you have learnt so far, to what extent is a community scorecard an effective way to communicate the vision and mission of the provider?

• How effective is a community scorecard in clearly articulating how a provider gives value to a community?

• Are there any essential elements to be included or considered?

• How useful is a community scorecard for your particular organisation?

• What are its strengths and what are its limitations in your circumstances?

• Does the community scorecard need to include a range of measures / indicators, or is it more suitable for it to focus on a particular issue / aspect?

• To what extent does / will your community scorecard include quantifiable measures that allow stakeholders to make a judgement about you and your provision?

• How straightforward was it to develop your community scorecard and can it easily be developed to meet changing needs or circumstances?

• How useful do you feel the community scorecard is to other organisations similar to yours?

• Are there certain types, size and location of provider where a community scorecard is likely to be more suitable?

• To what extent do you think the community scorecard is of interest to other key stakeholders such as local authorities, public bodies, community and voluntary groups and employers?


4. Research conclusions

Introduction

This section discusses the research conclusions from the project overall, based on the individual reports from the eleven providers involved in the action research. For ease of reference, the providers involved are listed again below:

• Leicester College (East Midlands)
• Petroc (South West)
• Stockton Riverside College (North East)
• Barton Peveril College (Sixth Form) (South East)
• Otley College (land-based) (East)
• In Touch Care Limited (Yorkshire and the Humber)
• GDR Solutions (UK) Limited (South West)
• Black Country Training Group (West Midlands)
• The Sellafield Ltd Unionlearn Centre (North West)
• Wandsworth Council Lifelong Learning (London)
• The Co-operative College (North West)

Each provider’s report included:

• Details of their own experiences of developing a community scorecard: rationale, purpose, project aims, methodology, definition of community, legitimacy of the scorecard, project outcomes, lessons learned and critical success factors, communication of the scorecard, and next steps;

• Critical reflections on the wider implications of a community scorecard for a range of provider types and delivery settings; and

• Factors to be considered by other providers looking to develop a community scorecard, together with any further support that might be required to assist them.

It is worth noting that as a result of this action research, all eleven providers started to identify and test the critical factors that need to be considered when developing a community scorecard. However, the extent to which each provider reached the stage of identifying measures and producing a community scorecard within the short project timescale varied, depending upon their starting points.

The key findings from each project have been distilled into an overall set of eight research conclusions that cover:

• the range of starting points for a diverse sector;
• the purpose of a community scorecard;
• defining the community;
• securing legitimacy for the community scorecard;
• approaches taken by individual providers;


• emerging measures;
• publication of results; and

• support for providers.

The range of starting points for a diverse sector

The diversity of the sector, reflected in the providers involved, is a constant refrain through these research conclusions.

Within the group of providers engaged in the action research, there were three general FE colleges serving a multitude of community groups with diverse and wide-ranging needs. These colleges particularly wanted to explore how a community scorecard could adequately reflect the impact they were having on their communities and to identify areas where there was scope for improvement.

A number of the participating providers served predominantly rural areas, including one specialist land-based college, a general FE college and a work-based learning provider. These providers wanted to ascertain how they could develop a community scorecard to reflect the particular needs of their learners, employers and community groups spread across a wide and often remote geographic area. These providers set out to develop some additional (not replacement) performance measures that would reflect the needs of isolated rural groups. For these providers, the community scorecard was viewed as potentially offering a more appropriate measure of their progress towards delivering their missions, which generally reflected ambitions to achieve learning excellence within the rural communities (including business communities) that they serve, rather than centralised performance measures.

Other providers were keen to test the suitability of a community scorecard in verifying the impact they were having within their own organisational structures. For example, the Co-operative College wanted to explore how a community scorecard could be adopted to reflect the added value of Co-operative schools by using comparator measures such as attendance, attainment and a reduction in vandalism to review performance against levels prior to the school becoming a co-operative or against similar schools in their location. The Black Country Training Group – a network of work-based learning providers – sought to establish whether they could develop a single community scorecard that could reflect the combined impact of all the group’s members, comprising over forty training providers and voluntary groups across the West Midlands.

Other providers saw the community scorecard as a means of raising awareness of services offered and as a way to gather intelligence from potential users to inform future services and support. These providers also hoped to evidence positive impact on their service users. Their aim in developing a community scorecard was not only to measure their impact on the communities they serve, but also to give evidence to their funding bodies or senior management to justify continuation of service delivery.

One provider used the community scorecard research as a means of gathering information on their local community impact to support a new business strategy. The early adopter, Leicester College, focused on a ‘soft launch’ of their already developed community scorecard, and on gathering community feedback.

The purpose of a community scorecard

Although the outline purpose of the community scorecard had been defined – to strengthen and demonstrate providers’ contribution and local accountability to their community – the eleven providers had a range of reasons for taking part in the project.

They all, however, shared the ambition of delivering services to meet community needs and of establishing performance measures for this work. Other drivers for engagement in the project included providing evidence to support public relations campaigns and funding decisions, influencing the strategic direction of their organisation, and commercial business expansion.


A small number of the providers saw the development of their community scorecard explicitly as a way of strengthening their strategic role in the locality.

There was no consensus among the providers about the purpose of a community scorecard. Whilst the freedom to shape the concept was warmly welcomed and seen as exciting at the beginning of the project, it actually caused some delays in executing the primary research, because a large number of the providers found it difficult to establish a clear purpose for their community scorecard. Indeed, even by the end of the action research phase there were a number of providers who had defined only an outline purpose. There were also a number that shifted their purpose during the course of the action research. Not surprisingly, the actual purposes of the providers’ scorecards mirrored their rationales for being involved in the action research. These were:

• to gauge how effective the community scorecard is as a tool for measuring an institution’s impact on a wide-ranging and diverse set of communities and locations;

• to establish the suitability of a community scorecard in offering an alternative set of performance measures that better reflect the operating circumstances of the provider;

• to measure current public opinion of the organisation, establishing whether stakeholders feel suitably informed about the organisation’s contribution to the community and ascertaining whether differences in opinion and knowledge exist across diverse community groups;

• to encourage non-learners to participate in learning by better understanding their needs and those factors that influence their decision to become engaged;

• to develop an evidence-gathering tool that acts as a two-way communication system between the general public and the organisation;

• to showcase what the provider can offer and gather feedback on whether this is what the community wants;

• to provide a framework for Co-operative schools to measure the impact of those aspects of their school organisation that are influenced / driven by co-operative values and principles; and

• to show how a college serves its local community and demonstrates its accountability to that local community by publishing key information about its performance.

Some providers sought the views of internal staff to inform the purpose, whereas others found it difficult to secure buy-in from within their organisation, including at a senior level. Others consulted with groups in their community to shape the development of their scorecards, with mixed results. However, the clear message from this group was that to be of any value, the scorecard must be meaningful to the communities they serve and not be perceived as a national imposition.

Providers also grappled with how the community scorecard would complement existing public information sources such as Framework for Excellence, Ofsted inspection reports and success rates. The shifts in policy around these existing information sources, and the publication of the Framework for Excellence data the day before the second project seminar in December all highlighted the tensions in this important but still emerging area of work.

The result of these deliberations was that some projects were small-scale in nature, potentially making it difficult to influence the culture change required for their organisations to secure true accountability to their communities. Overall, the experience of providers suggests that if an organisation is to develop a community scorecard, then it needs to be part of a long-term strategy and based on a real desire for public accountability.


It has the potential to be an invaluable continuous improvement tool for providers who genuinely seek and act upon community feedback, as well as serving a variety of other uses, from supporting community engagement to PR.

Defining the community

Defining the community was essential at the outset to enable providers to target their research and seek meaningful views to inform their scorecard. Some providers felt they served a diverse community, whereas others thought they served several distinct communities with differing needs.

Some providers decided to define community by geography; others felt customers constituted their community. A small number took a broader approach to encompass ‘users’, ‘non-users’ and ‘stakeholders’. However, for most the key community focus for this project was service users rather than wider stakeholders and other public bodies. This may have been due in part to the time constraints of the project, or the general turbulence in the public sector landscape in autumn 2010.

Some providers changed their perspectives on the scope of their definition of community in relation to this project as their action research progressed, particularly as they realised the scale of what they were planning to undertake within the timescale of the project. This resulted in some providers defining one or two distinct communities within their overall community to research as part of the project. This was often driven by a key strategy that the provider wished to pursue, for instance, increasing adult learner recruitment or employer engagement, engaging with consortium partners, developing a new venture, or gauging perceptions of the provider following a merger. On the other hand, two provider networks explored the possibility of putting in place a single scorecard for the whole network.

Securing legitimacy for the community scorecard

Each provider’s journey in developing the scorecard was unique according to the internal and external environment in which they operated. Although legitimacy was an area of consideration for all, it was explored in more detail amongst those providers further along the developmental journey. Leicester College, for example, used the action research project to ‘soft launch’ its scorecard with the community, but this was the culmination of two years’ development within the organisation, including with governors.

Leicester College

The College developed its scorecard through the involvement of a number of key stakeholders over a two-year period, including the student liaison committee and attendees at the College's Annual General Meeting. The College governors were closely involved throughout and provided the impetus for the development of the scorecard.

The focus of the College's action research project was to build on work already undertaken by developing a web-based version of the scorecard and conducting a 'soft launch' through day-to-day contact and a direct mailing to key partners. The purpose of the 'soft launch' was to provide an opportunity to refine and further develop the scorecard prior to re-launch in 2011.

The College monitored website activity, recording more than 860 hits, including 420 unique visits, during the project. The College received "overwhelmingly positive comments" in response to the web link emailed to key partners; however, a feedback button on the website itself elicited no responses. The College subsequently identified that it still had more to do to engage the local community with the scorecard and is looking to embed it within the College's wider communication channels.


Support for the development of a scorecard varied between providers, with some having top-level governor and senior management team endorsement. Others restricted their community scorecard development to their own areas of influence, almost working in isolation from the rest of the organisation; in these cases, success resulted from the commitment and hard work of small teams (often one or two individuals) rather than from a strategic imperative. However, the individuals engaged in the development of a community scorecard felt that top-level commitment was essential, not least to ensure that sufficient resource was available to develop and maintain the scorecard.

Reasons for generally low levels of senior buy-in are unclear, although the lack of clarity around providers’ purposes in pursuing the scorecard could be a factor. Also, there was uncertainty regarding the relationship with existing datasets and how the community scorecard would sit alongside other public information such as Framework for Excellence and Ofsted reports. The project was conducted in autumn 2010 at a time when the implications of the Coalition Government’s approach to public service reform were only beginning to emerge. There was uncertainty about the new balance between national and local accountability and the relationship to future funding, and perhaps a reluctance to move too far from a focus on known performance measures.

The tensions between transparency and the importance of good publicity were felt by some to be a factor, particularly in areas of competition. There was also some anxiety about the potential risk to an institution's community profile of raising public expectations through community engagement and consultation during a period of austerity.

Approaches by individual providers

As has already been said, the providers all had different starting points and ambitions for their projects. The research was most effective when the provider had a clear view at the outset of their community and their research methodology, and the project was managed within the context of measuring the contribution that the provider made to the community. Some projects 'piggy-backed' on other initiatives, although this was generally found not to be an effective approach to developing a scorecard.

Many providers had originally set a broad definition of community and, as a result, the initial research methodologies were wide-ranging and, on reflection, perhaps over-ambitious. Once the project commenced and the scale of the challenge was mapped out, it became clear to some that it would be impossible to include all communities within the scope and timescale of the project. Some providers, such as Stockton Riverside College, therefore identified a number of priority community groups, for example workless people and those in need of employability skills, as the focus for their research. Once this more precise definition had been agreed, the scale of the task often became much more manageable.

Other providers took a much more pragmatic approach. Before embarking on gathering hard evidence of their community contribution, they spent the research phase holding internal meetings to explain the purpose of the community scorecard and secure the support and buy-in of internal management and teaching staff, and followed this up with internal focus groups to define the community and agree which community groups were to be the audience for the scorecard. Others focused on gathering the views of their member organisations, key external stakeholders and local opinion formers.

The level of research expertise differed amongst providers, for example in the design of research instruments to provide robust and verifiable results. The amount of resource available also varied widely.

In general, large general FE colleges were more ambitious in their aims and objectives and tended to define their community more widely for research purposes. These providers planned to gather views across their communities and then disaggregate the views through their proposed research methodologies. Two of the general FE colleges had research specialists involved in the project, and these organisations were experienced in gathering and sharing information. Leicester College, for example, had segmented its community into different user groups and used existing and publicly available information to gauge its contribution to the local community.


Petroc

The action research project was timely for Petroc, as the College was about to embark on a major strategic plan consultation exercise and opted to add to that process a very short survey asking people to rank their perceptions of how the College contributed to the community. The consultation was web-based, and the College sent a direct email to key stakeholders, college staff, recent and current learners, schools, local authorities and elected members – totalling 20,000 individuals. In addition, the local paper ran a front-page story covering the College's strategic vision and included the web address where individuals could become involved in the consultation. Unfortunately, the level of response was low, but that in itself was an important learning outcome for the College.

Some of the smaller providers, particularly from the vocational learning sector, found the action research approach more challenging, partly due to inexperience in this area and a lack of knowledge of robust research methodologies.

Interestingly, the majority of the providers’ methodologies focused on conducting research to gather primary data to inform their scorecard measures, rather than utilising existing data sources. This could be attributed to the fact that providers were keen to develop a community scorecard with a local flavour and wished to take the opportunity afforded through this project to test out new approaches. It may also reflect the timing of the project: most of the research was completed before the Framework for Excellence data was published in December 2010. There was recognition, however, that conducting primary research was time-consuming and could require significant on-going resource commitment. A small number of providers considered an approach that involved using existing data sources.

The learner, or potential learner, was rarely the focus of the primary research; where research was conducted with 'users', response rates were higher than for approaches that sought to involve 'non-users'. This could be a reflection of the lead-in time required to engage these groups in research and the learning that providers would have to undertake before they were able to do this effectively. It also reflects that an organisation needs to feel confident, when inviting community views, that it has a clear strategy in place which:

• has a clear purpose;

• will not raise expectations that cannot be met;

• asks relevant questions;

• stimulates a valid and robust response; and

• does not place the organisation in a difficult situation whereby it may have to publish or respond openly to negative perceptions and views.

Many of the providers had only begun preliminary research by the end of the project and would need to undertake further research to develop a full scorecard.

Emerging measures

There were differing views from providers as to what type of measures to include in a community scorecard, with the intelligence gathered being wide-ranging and encompassing perceptions, customer satisfaction, stakeholder views and community needs. Some also perceived difficulties in establishing robust measures that could be replicated.


Not all providers had time to define in detail the measures that would constitute their scorecards. However, a number of providers did identify some top-level measures that could be further refined, including: reputation; sustainability; community engagement (such as volunteering at events); quality; customer feedback / satisfaction; learner support; value for money; learners recruited from disadvantaged groups / areas; and curriculum meeting community needs. One provider also suggested a measure of 'innovative and visionary leadership'. Some of these measures reflect existing national indicators, but others went beyond them, in particular community engagement (for example, numbers of students volunteering in the community, or the number and range of events open to the community); the extent to which patterns of participation could be mapped to areas of disadvantage; and the extent to which the provider's curriculum offer could be seen to be meeting the needs of the community. Some providers started to reflect on how 'soft outcomes', for instance user feedback, could be measured and form part of the scorecard, but this was felt to be more challenging and there was insufficient time to consider it fully within this project.
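One of these emerging measures, the extent to which participation maps onto areas of disadvantage, lends itself to a simple worked illustration. The Python sketch below is purely illustrative: it shows one way a provider might estimate the share of learners recruited from disadvantaged areas by joining learner postcode districts to deprivation deciles. The lookup table, district codes and cutoff decile are all invented assumptions, and the choice of deprivation data source is ours rather than anything prescribed by the project.

```python
# Illustrative sketch only: estimating the share of learners recruited from
# disadvantaged areas by joining postcode districts to deprivation deciles.
# All lookup values and district codes below are invented for illustration.

# Hypothetical lookup: postcode district -> deprivation decile
# (1 = most deprived 10% of areas, 10 = least deprived).
DEPRIVATION_DECILE = {"AB1": 2, "AB2": 1, "AB3": 5, "AB4": 8}

learner_districts = ["AB1", "AB2", "AB2", "AB3", "AB4", "AB1"]

def share_from_disadvantaged_areas(districts: list[str], cutoff: int = 2) -> float:
    """Percentage of learners living in areas at or below the cutoff decile.
    Learners whose district is missing from the lookup are excluded."""
    known = [d for d in districts if d in DEPRIVATION_DECILE]
    if not known:
        return 0.0
    disadvantaged = sum(1 for d in known if DEPRIVATION_DECILE[d] <= cutoff)
    return 100.0 * disadvantaged / len(known)

print(f"{share_from_disadvantaged_areas(learner_districts):.0f}%")  # prints: 67%
```

In practice, the granularity of the join (full postcode versus district) and the choice of cutoff would materially change the reported figure, so both would need to be agreed and published alongside the measure.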

Some providers adopted a hybrid approach using a combination of new and existing data, thereby enabling the scorecard to be individualised, whilst at the same time complementing existing public information such as Framework for Excellence.

Three provider case studies illustrate the types of measures that might be included within a community scorecard.

Barton Peveril College

Barton Peveril College set out to consult with delivery staff, learners and stakeholders on what a community scorecard might look like, how it could be used to the advantage of the college and how measures could be developed. The focus was adult and community learning.

Staff were consulted initially and, as a result, proposed measures were identified. The measures were then incorporated into a questionnaire that was used to consult with the community.

The measures identified were:

• childcare facilities
• community involvement
• cost of course
• exam results
• learner support
• location and access
• environmental friendliness
• Ofsted
• refreshments
• reputation

The research indicated that the measures of cost of course, exam results, location and access, and reputation were particularly important to stakeholders.


Wandsworth Council Lifelong Learning

Wandsworth Council set out to review its community development frameworks and to establish whether it was collating the right evidence to judge the effectiveness of its community development provision, and whether it was judging the impact of that provision from a sufficiently wide perspective. The research was mainly undertaken through existing groupings. Wandsworth Council was also involved in the LSIS PRD community development project.

The Council’s mission and strategic priorities, together with its defined priority groups, showed how identifying one community was impossible as the Council served many and varied communities. The focus for this project was taken to be stakeholders in adult and community learning, with learners at the centre and the provider, the community and employers also represented.

Indicators were considered for each of the key stakeholder groups, as listed below; an illustrative sketch of how such indicators might be computed follows the lists.

The learner:
• percentage of learners achieving their learning outcome
• percentage of learners progressing onto a higher level of learning
• percentage of learners gaining a job (previously unemployed)
• percentage of learners gaining a promotion / changing jobs with their existing employer
• percentage of learners starting a new business / developing their existing business
• percentage of learners who gained voluntary work experience
• percentage of learners being more involved in their community

The provider:
• percentage of learners aged 60+
• percentage of learners from disadvantaged areas
• percentage of learners with below level 2 qualifications
• percentage of learners from BME groups
• percentage of male learners
• percentage of learners with disabilities / learning difficulties
• percentage of learners studying Skills for Life (literacy, numeracy and ESOL)
• percentage of learners retained on the course

The community:
• percentage of learners volunteering with a local community group
• percentage of learners becoming involved with their children's school
• percentage of learners who have gained voluntary work experience
• percentage of learners being more involved in their community (e.g. joining coffee mornings, meeting neighbours in groups, joining the neighbourhood watch)
• percentage of families becoming more involved in the community / school

The employer:
• employees who have increased responsibility in the workplace as a result of their learning
• employees who have gained promotion as a result of their learning
• employees who have gained skills through acting as a workplace supervisor
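Computationally, indicator sets like these reduce to simple proportions of a cohort, grouped by stakeholder. As a minimal sketch of how such a scorecard might be held and rendered, the Python below is purely illustrative: the class names, field names and figures are our assumptions, not part of the Council's framework.

```python
# Illustrative sketch only: one way to structure stakeholder indicator
# groups like those above and compute percentage measures from raw counts.
# All names and figures here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Indicator:
    name: str          # e.g. "learners progressing to a higher level"
    numerator: int     # learners meeting the indicator
    denominator: int   # learners in scope for the indicator

    def percentage(self) -> float:
        # Guard against empty cohorts rather than raising ZeroDivisionError.
        return 100.0 * self.numerator / self.denominator if self.denominator else 0.0

@dataclass
class StakeholderGroup:
    name: str                      # "The learner", "The provider", ...
    indicators: list[Indicator] = field(default_factory=list)

def render_scorecard(groups: list[StakeholderGroup]) -> str:
    """Produce a plain-text scorecard, one line per indicator."""
    lines = []
    for group in groups:
        lines.append(group.name + ":")
        for ind in group.indicators:
            lines.append(f"  {ind.name}: {ind.percentage():.1f}%")
    return "\n".join(lines)

# Hypothetical figures for illustration only.
learner = StakeholderGroup("The learner", [
    Indicator("learners achieving their learning outcome", 412, 480),
    Indicator("learners progressing to a higher level of learning", 150, 480),
])
print(render_scorecard([learner]))
```

Keeping numerator and denominator separate, rather than storing only the percentage, preserves the context needed to judge whether a figure rests on five learners or five hundred.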


Black Country Training Group (BCTG)

BCTG set out to develop an overarching community scorecard on behalf of all its training provider members that would show what end-users think of the service provided. Partners were consulted on a number of proposed measures that included:

• quality – service quality standards and quality reports;
• learner voice – success stories, learner feedback, surveys and questionnaires;
• support – value for money, pastoral care, and information, advice and guidance; and
• employer feedback.

Providers were generally keen to develop a community scorecard that was relevant to their organisation and the community it serves, feeling it was important that the scorecard was seen to be owned and driven by themselves and their community rather than imposed as a national requirement.

At the outset of the project, the invitation to providers to determine their own scorecard, utilising whatever measures they felt appropriate, was certainly a key motivating factor, although one of the key learning points arising from the project was that this can be both challenging and time-consuming. The publication of Framework for Excellence data just prior to the December seminar prompted providers to consider whether a community scorecard framework with some guidance might provide a useful tool for other providers, perhaps extending as far as incorporating some core measures alongside the flexibility to design their own.

Publication of results

A number of providers considered how they might present their scorecard to the community, with a web-based approach being particularly popular. Some providers also considered the extent to which the scorecard could become an interactive tool that invited user feedback, but it was unclear from this project how such feedback, particularly negative messages, would be managed.

Consideration was also given to how the scorecard would relate to existing published data and whether, for instance, it should become a central data dashboard with signposting to other sources of information. Others had yet to decide whether they wished to communicate their research findings at all, and a small number were concerned about how to manage potentially conflicting messages, for example where public information seemed at odds with community views.


Support for providers

Providers participating in the project generally made significant progress towards either the development or the enhancement of their scorecard approach. However, it proved overly ambitious for providers to conduct primary research and develop a community scorecard from scratch within the project timescale; some would require further time and resource to ensure that the scorecard continued to be implemented.

A number of key stages were identified in the journey to develop a community scorecard:

• securing senior management commitment;
• integrating within strategic plans;
• defining the community;
• seeking community views;
• developing measures;
• testing measures;
• publishing the scorecard in a user-friendly format; and
• continuous improvement and review.

Although none of the participating providers completed all these stages, they all made progress on their individual community scorecard journeys. The development of a scorecard that is relevant both internally and externally requires time and effort; and similarly, maintaining a scorecard also requires resource commitment. It is clear from this project that providers looking to embark on the development of a community scorecard, or more generally to devise approaches to demonstrating their contribution to their communities, would benefit from advice and support which at the very least includes dissemination of the findings from this research to enable providers to learn from one another’s experiences. Other support could include:

• mapping, analysis and dissemination of other community scorecard development activity – both in the FE and skills sector, and more widely across other public services;

• establishing good practice forums where providers can share their experiences, perhaps through the Excellence Gateway; and

• development of a community scorecard toolkit that could provide guidance on topics such as research techniques and the relative merits of utilising existing and new data sources.


5. Emerging implications and six proposals for policy development and implementation

Introduction

The intention of the action research project was to give a group of providers the space to explore what information could be used to engage with, and demonstrate their responsiveness and contribution to, their community or communities. The diversity of the group of providers, both in how far these issues had previously been considered and in the types of institution and their geographical locations, has meant that the question has been explored from a number of different angles.

The research conclusions outlined in the previous section highlight the richness of the learning that this approach gave. This section builds on the research conclusions and draws out some of the emerging implications for policy development and implementation and makes six proposals.

Overall, some concerns were raised about the term 'community scorecard', and there are options for how this work could be described going forward. Although the action research was overseen by the NIPB alongside its work on FE public information and the 'Informing Choice Framework', the shift towards greater freedoms that has emerged in the last year has created greater space, and greater necessity, for providers to develop strategic plans that are responsive to their communities. Within this context, new ways of thinking about accountability could assist providers to articulate as fully and transparently as possible their contribution to civic and community wellbeing.

Audiences

Ideas about the purpose of a community scorecard varied between the providers, and a range of different audiences were identified. The motivations for engaging with a community scorecard, and the benefits of doing so, also vary depending on the audience: for example, the reasons for providing information to potential learners will not be the same as those for a provider wanting to make information available to their local authority.

There seem to be two main groups that are external to the provider. The first group are 'customers'. This group includes existing and potential customers (learners, potential learners, employers) and what could be described as 'the wider learning community'. The wider learning community encompasses organisations and groups that support people to get involved in learning, including community groups, support groups and career guidance professionals.

Individuals and groups in this category need information to use to make decisions or to help other people make decisions. Informing choice by making information publicly available will be increasingly important as public service reform seeks to empower citizens and communities, and as the balance of responsibility for paying for provision changes. If people are asked to invest financially in their learning it seems logical to assume that they will demand more information on which to base their decisions. The benefits for providers of offering more information to customers include increased recruitment as more people understand what the provider can offer, and improved retention and success as people use information to make better decisions about which courses are right for them.

The second group are 'strategic partners'. This group is broad: it includes local authorities and other public sector partners such as health and the police, as well as private and third sector partners such as Local Enterprise Partnerships, local chambers of commerce, and voluntary and social enterprise sector bodies. Given its core purpose of human capacity development, the role of the FE and skills sector has relevance for a wide range of stakeholders and strategic players in the locality. It will be beneficial for the sector to develop awareness of its contribution to this wide-ranging audience. As more responsibility is devolved, it will be important that strategic partners in a local area understand and recognise each other's roles to enable them to plan and work effectively in partnership. In a time of limited funding, this will be particularly important as partners look for ways to secure the maximum value from the collective investment in an area. Further development of the scorecard concept should therefore explore the scope and priorities of strategic partners and civil society in order to position the sector more centrally in relation to the strategic development of communities.

Some of the providers also highlighted the importance of internal audiences and the value of the scorecard in informing their strategic planning process. There was not a single internal audience, so a community scorecard could support:

• the principal / chief executive / head of service / governors / trustees to make decisions about their organisation’s strategic direction;

• curriculum heads in making decisions about the mix of provision that can best meet the needs of the local community;

• quality improvement managers reviewing the overall responsiveness and performance of the organisation; and

• marketing departments in recruitment and other campaigns.

Proposal 1: It is important to recognise the wide range of audiences for information on FE and skills. The sector needs to develop its awareness of the priorities of strategic partners and civil society in order to shape its response to the emerging ‘Big Society’ concept.

Presentation of information

Different audiences have different information requirements. For customers, the NIPB's work to develop a "more comprehensive range of public information", building on the Framework for Excellence, is important in terms of securing a nationally consistent, comparable set of information on which learners and employers can take decisions about the types of provision and providers in which they wish to invest.

Other types of information for customers will be particular to different providers: for example, arrangements for the new discretionary support funds; details of facilities, resources and materials available; information about fees (including fee loans in future); and promotional offers. York Consulting, for the NIPB, has been looking at the different types of information valued by learners and employers and this will be a good source of evidence for providers wishing to develop their own provision of information.

In addition, we note the new measures included in the Structural Reform Plan for BIS which may in turn become part of a wider set of public information: in particular, the proportion of college leavers who go into employment or further training and are still there after 6 months; the proportion of young people from disadvantaged backgrounds who go on to higher education by age 19; and the number of apprenticeship places.

Our research suggests that securing accountability to communities needs to be part of a long-term institutional strategy and plan. Measures of community accountability should reflect the overall goals of the institution, negotiated with and responsive to its customers, community(ies) and strategic partners. Experience from the community scorecard projects suggests that individual provider surveys of their communities may be helpful in highlighting the sort of local information that customers want.

Strategic partners will be interested in the consistent, comparable information available through the framework of public information, but often through a local, community-wide or sectoral lens. So a local authority and its partners might want to know not only that a provider performs well on the provision of long vocational courses, but that these courses serve the key sectoral or occupational priorities of the local economy; or that a provider is investing in engagement provision in a community with poor health outcomes or a history of community tension14. The narrative will be different for different providers serving different communities.

A key challenge for providers will be how to analyse, interpret, use and present an increasingly wide range of public information and evidence on FE and skills in ways that are authentic and accessible. The growing emphasis on being responsive to customers and communities places a premium on providers' information and stakeholder communication strategies, marking a shift from previous arrangements, which tended to see the primary audience for 'performance information' as the funding agencies, Ofsted and government. There is a wide range of data already available to assist providers, but our research suggests some organisations would welcome support in thinking about how to use and present this information in ways that will help to demonstrate their contribution to their community and strengthen their engagement with economic and social partners. The 2020 Public Services Hub's recent report15 on a 'social productivity' approach to FE and skills argues for the importance of all public services being able to 'incubate' and express clearly their social value within communities, and this should be an area for further exploration.

Proposal 2: It would be helpful to develop a spectrum of measures as examples for providers, but ultimately the measures providers select must enable them, with their communities, to assess progress towards and achievement of their goals.

Proposal 3: Sector leaders should consider the new challenges and opportunities of working with an increasingly wide range of public information, and making sense of it with and for different audiences / stakeholders. LSIS could support the sector to explore how information and communication strategies should underpin the shift towards greater community responsiveness.

Proposal 4: The implications of emerging thinking about social productivity could be further explored as a possible guiding framework for generating more meaningful metrics to account for providers’ strategic engagement with and contribution to their communities and localities.

Dynamic engagement with communities

In the action research, although some providers focused on presenting existing data in an accessible format, a number of providers took the approach of conducting primary research, most frequently by asking the community for their views of the provider. The providers that took this approach encountered a number of issues, including:

• The resources involved in carrying out primary research are not insignificant. This relates to the specialist skills required of staff, as well as the time this takes and the associated cost.

• At the time the surveys were conducted (autumn 2010) there was a genuine concern amongst providers about raising the expectations of the community when reduced funding meant providers were not sure they would be able to respond to increased demand.

• Not all feedback is positive. Even providers committed to closing the feedback loop with their communities had concerns about going public with negative feedback, particularly if a provider located close to them was taking a different approach.

14 Evidence on the wider benefits of learning is clear that participating in learning can have positive effects on health and civic engagement: www.learningbenefits.net
15 2020 Public Services Hub at the RSA, 'The further education and skills sector in 2020: a social productivity approach': www.lsis.org.uk/2020PSH


These issues highlight a key emerging challenge for providers: how to discharge their responsibilities for responsiveness, develop more dynamic engagement with their communities, and empower customers and partners to interact with them to co-design services and programmes. Many of the providers involved in the community scorecard research envisaged presenting their information in a web-based format. The potential of technology to support providers' information and communication strategies is, we know, being explored across the sector. To give just two examples, providers are using Facebook, amongst other methods, to promote courses, communicate with students and create online learning communities; others are trialling 'TripAdvisor'-style approaches to enabling students to rate and comment on courses.

There is scope for information strategies to go beyond the transactional – the presentation of different types of information and traditional surveys – to harness the interactive potential of technology. In so doing, providers would open up opportunities to gather more systematic market research and intelligence and to use this to further develop deliberative engagement with their communities. If, for example, prospective learners and employers were able to register their interest in courses or programmes that were not currently available, providers would have a way of understanding latent demand and could choose to respond, subject to numbers, costs, availability of staff and resources, etc. Alternatively, the provider could assist with matching – employers with potential recruits or apprentices, or volunteer tutors with interested students – becoming a facilitator of a wider learning community and communities of interest, contributing to the Big Society as well as opening up new opportunities for business development. The responsiveness of this approach could become part of the provider’s overall narrative to both customers and strategic partners, visibly connecting it to the concerns and priorities of the communities it serves.
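As a concrete illustration of the latent-demand idea in the paragraph above, the short Python sketch below records registrations of interest against courses a provider does not currently run and tallies them against a viability threshold. It is a hypothetical design: the class, method names and threshold are our assumptions, not a description of any existing provider system.

```python
# Illustrative sketch: recording registrations of interest in courses a
# provider does not currently run, so latent demand becomes visible.
# Hypothetical design, not a description of any actual provider system.
from collections import Counter

class InterestRegister:
    def __init__(self) -> None:
        self._interest = Counter()  # normalised course title -> count

    def register(self, course_title: str) -> None:
        # Normalise titles so "ESOL Entry 1" and "esol entry 1" tally together.
        self._interest[course_title.strip().lower()] += 1

    def latent_demand(self, threshold: int = 10):
        """Courses whose registered interest meets a viability threshold,
        most-requested first; the threshold stands in for the provider's
        judgement on numbers, costs and staff availability."""
        return [(c, n) for c, n in self._interest.most_common() if n >= threshold]

register = InterestRegister()
for _ in range(12):
    register.register("Evening bookkeeping (Level 2)")
register.register("Beginners' British Sign Language")
print(register.latent_demand())  # [('evening bookkeeping (level 2)', 12)]
```

A real implementation would also capture who registered and when, so the provider could follow up once a course becomes viable; the tally simply makes the aggregation step visible.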

Proposal 5: Sector providers should explore innovative approaches to developing dynamic engagement with their communities, including how to harness the potential of technology, and the possible economies of scale of working across the sector.

Strategic direction and ownership

There was variation in the level of senior leadership involvement across the providers involved in the action research. Where there was senior leadership involvement, the work was more likely to be used to support the strategic planning of the institution. This is perhaps not surprising, but it does demonstrate the importance of senior leadership if information is genuinely going to be used to account to the community for a provider's responsiveness, contribution and value: such approaches need to be embedded and made a priority within an institution.

Senior leadership also seems to be important for practical reasons. Firstly, the research identified that it takes time to develop and agree what information should be made public; this was the case where existing information was used as well as where new research was carried out, so sufficient staff time needs to be allocated. Secondly, and relatedly, there may be implications for staff development, both for conducting primary research and for making sense of existing information.

Proposal 6: Sector leaders should consider the implications of the shift to greater community responsiveness and public information for their approaches to accounting for their contribution to the economic and social life of their communities.


Annex 1: Community Scorecard

Purpose

1. This paper proposes an initial outline of the purpose and scope of the community scorecard for learning and skills sector providers.

2. It aims to:

• locate the community scorecard proposal in the context of the UK Commission's overall proposals to provide more public information on the performance of the sector;

• suggest how evidence about the role of providers in their communities can inform the development of the scorecard;

• frame the purpose of the community scorecard, suggest the key audiences for it, and some of the benefits that could flow from it; and

• outline an action research project to develop prototypes with colleges and other providers over the summer / autumn, with support from the UK Commission.

Background

The policy perspective

3. The UKCES’s paper ‘Towards Ambition 2020: skills, jobs, growth’, published in October 2009, makes a series of proposals for ‘building a more strategic, agile and labour market led employment and skills system’. At the heart of these proposals is an intention to “increase trust in, and authority to, learning providers”. Providing more public information on the performance of providers, and increasing their accountability to their communities is seen as one way to do this. The aim is for citizens, communities and employers, and members of local public service boards to be empowered to become far more directly involved in shaping outcomes and improving performance of the employment and skills system, and better informed to make choices.

4. In other public systems, such as local government, health and policing, there have, in recent years, been moves to make information on performance publicly available and to empower citizens and communities to become far more directly involved in shaping outcomes and improving performance. This direction of travel is set to continue as a central feature of public service reform.

5. The UK Commission’s proposals include:

• requiring that all publicly-funded learning providers consult widely and collaboratively with employers and other stakeholders in their community, and use available labour market information to shape annually their mix of provision to meet the needs of their labour market;

• requiring all publicly-funded learning programmes to provide public quality labelling on key outputs and outcomes such as learner success rates, destinations, wage gain, quality and satisfaction levels for learners and employers;



• creating a new and public institutional performance framework for learning providers – a balanced scorecard – based on aggregate outcomes / destinations, satisfaction levels and quality; and

• balancing the institutional performance framework with “evidence of the economic, social and labour market characteristics of the local catchment area”, to allow “for the recognition of post-code effects and distance travelled”.

6. LSIS has an interest in and experience of supporting and encouraging colleges and providers to play a strategic role in supporting their local communities, and has agreed to work with the UKCES on the development of a 'community scorecard' to address the fourth proposal.

The locality perspective

7. A number of recent developments have raised the profile and challenged the practice of learning and skills providers as strategic partners in their local communities. These include:

• the Total Place pilots – LSIS commissioned and worked closely with AoC to explore the involvement of colleges in the pilots;

• the new duty on colleges to promote economic and social wellbeing in their localities – LSIS was commissioned by the Department for Business, Innovation and Skills to develop guidance for the sector;

• opening up college premises to the community to support informal learning – LSIS was commissioned by BIS to produce a good practice guide;

• tackling and preventing gun, knife and gang-related problems in the sector – LSIS has produced a good practice website to support the sector; and

• LSIS's support programme for equalities, diversity and community development.

8. This work provides a body of evidence from the sector that can inform the development of the community scorecard. It illustrates the breadth of potential community development activities undertaken by providers and indicates that different providers tend to prioritise different dimensions or elements, according to their mission, the scope of their provision, and their local economic and social contexts.

9. Rather than offering the community scorecard as another framework, we will seek to develop it as a spectrum or range of potential ways for institutions to develop their contribution to communities, and to consider their reputation and communications within their locality. We want it to be a stimulus for providers to consider how to secure strategic gain and to play their part in local public service coalitions. Ultimately, this means providing more information to their local community, and therefore strengthening local accountability – which is what we consider to be the key purpose of the community scorecard.

Purpose of the community scorecard

10. To offer a spectrum of ways for further education providers to articulate and enhance their value to citizens and partners in their local area. This could encompass:

• establishing the fit between the mission of the organisation and local strategic priorities;

• raising awareness of the wider contribution that providers of further education make to securing social and economic wellbeing in their localities;


• communicating the economic and social returns to learning beyond learning outcomes;

• enhancing the accountability of providers to citizens and communities;

• providing a focus for developing leaders of learning and skills as strategic partners in local public service coalitions; and

• providing a stimulus to develop partnerships for better integrated delivery, efficiencies, and improved outcomes for learners.

Audiences

11. The key audiences for the community scorecard are likely to be:

• local authorities and other public service partners;
• citizens, and community and interest groups;
• learners, parents and carers;

• employers and employer organisations; and
• the media and general public.

Benefits

12. For providers, the community scorecard offers an opportunity to:

• gain greater credibility and recognition as a strategic partner with significant local public sector partners;

• influence policies, strategies and budgets in the locality;

• raise their profile with key stakeholders and position colleges and providers in wider local education partnerships and as key players in a broad local public service coalition;

• contribute solutions to complex issues that help to secure effective outcomes for learners and the wider community;

• develop college provision to meet local strategic needs;

• promote a positive local learning culture, stimulate ambition, and encourage demand for learning;

• secure the contribution of further education to reducing inequality and promoting social mobility, as part of a community partnership;

• demonstrate the wider value of learning to individuals, communities and employers; and

• encourage citizens to take pride in ‘their’ local further education provider.

13. For learners, employers and the community, the community scorecard should:


• provide a vehicle for stronger public debate about the role of learning and skills in securing local prosperity and wellbeing;

• offer better information about support and opportunities across learning and skills sector institutions and other public services;

• ensure better alignment of provision with need, working across the public, private and third sectors; and

• identify opportunities to secure improved value for money for taxpayers through more effective scrutiny of services.

14. For local authorities and public service partners, the community scorecard may present opportunities to:

• harness the leadership capability within the further education sector to help shape local priorities and plans and develop innovative approaches to delivering effective public services; and

• place strategies for transforming lives through learning at the heart of local vision and plans for wellbeing, prosperity, and empowerment.

Complexity of local contexts

15. The community scorecard must recognise the diversity in the range and mix of institutional and administrative arrangements in different localities.

16. In terms of institutions, there is a range of FE providers: general further education colleges and sixth form colleges; independent work-based learning providers; specialist colleges; and public and third sector providers, including local authorities. These institutions have a range of missions, institutional foci, learners, provision, kitemarks (e.g. Beacon status), and constitutional / governance arrangements. Within the further education market there is inevitable competition, which is likely to increase as resources get tighter and the focus on marketisation increases.

17. Administratively, the range of local government arrangements is also complex. Some areas operate under unitary councils, others under two-tier authorities. Institutions can find themselves serving multiple local authority areas, or part of a shire county authority. In addition, there are different governance arrangements in London (which has a strong regional identity and an elected mayor) and Greater Manchester / Birmingham (recently designated as city regions, with formal agreements between participating local authorities and enhanced powers in relation to skills).

18. In addition, spatial, demographic and socio-economic differences inform different local community priorities, a point the UK Commission seeks to capture in their proposal that all publicly-funded learning providers should “consult widely and collaboratively with employers and other stakeholders in their community, and use available labour market information to annually shape their mix of provision to meet the needs of their labour market”.

19. This diversity underlines the necessity to avoid any prescriptive design for the community scorecard. Rather, it needs to provide a spectrum of opportunities from which providers can select the most appropriate mix for their specific context.

Design and piloting

20. Our approach is to engage the sector in shaping the development of the community scorecard from the beginning.

21. We propose to invite ‘willing’ providers – from FE colleges and the work-based and ACL sectors – to join a small action research / piloting group.

22. Based on an agreed outline, the focus of the action research / piloting would be to:

• establish a clear narrative about the purpose of the community scorecard;

• develop the key elements of the community scorecard spectrum from which providers can select;

• develop a range of impact measures appropriate to providers’ contexts;

• consider how the community scorecard will secure its legitimacy, including the role of trustees, and who will need to be involved in authorising it;

• consider how to use the scorecard, what it will look like, how it will be communicated, and where it might be displayed;

• trial aspects of the scorecard to provide examples of how it could work in practice;

• identify barriers to the effective implementation of the scorecard;

• identify what further support is required for colleges, providers, and other public sector partners to ensure the community scorecard adds value; and

• identify key systems issues for further investigation with the UK Commission and the Department for Business, Innovation and Skills.


Proposed action research methodology and timescale

May / early June – Discussion with the UK Commission re: scale of projects; agreement on selection criteria.
July – Invitation to become a project; selection of providers.
9 September (1-day seminar) – Inaugural development seminar to: discuss purpose, scope, etc.; agree specific areas of focus of different projects; action plan processes / approaches.
September / October / November – Projects develop practice, supported by LSIS associate.
Early December (1-day seminar) – Second seminar to: share progress / approaches; agree implications for purpose and scope; capture key lessons; agree options and proposals for taking forward the scorecard.
Mid–end January – Final reports from each project.
February – March – Summative report on the action research; dissemination.


Registered Office: Friars House, Manor House Drive, Coventry CV1 2TE
t 024 7662 7900
e [email protected]

LSIS258

© Learning and Skills Improvement Service 2011
Company number 06454450
Registered charity number 1123636
September 2011