
Cambridge Journal of Education, Vol. 25, No. 1, 1995, pp. 23-34

OFSTED: inspecting schools and improvement through inspection

PETER MATTHEWS, Head of Quality Assurance and Development, OFSTED

GEORGE SMITH, Research Consultant, OFSTED

ABSTRACT Approximately one-third of all maintained secondary schools in England, and a smaller tranche of nursery, primary and special schools, were inspected under the arrangements set out in Section 9 of the 1992 Education (Schools) Act by the end of 1994. This article reviews some aspects of quality assurance in the national inspection system in England. It looks in particular at the improvement of inspection and at improvement through inspection. Continuous improvement of the inspection system reflects feedback from key stakeholders, as well as work by OFSTED to incorporate new developments. Improvement through inspection, broadly defined, has to cover both what occurs at the individual school level and in the system at large as a result of inspection. We argue that an effective inspection system can provide a powerful incentive for, as well as directly contributing to, school improvement and development.

BACKGROUND: OFSTED'S ROLE

The 1992 Education (Schools) Act created a new non-ministerial government department, the Office of Her Majesty's Chief Inspector, charged with managing and regulating a national system of school inspection by independent inspectors in England. The new department, termed OFSTED (the Office for Standards in Education), is also required to inform the Secretary of State of the quality of education provided by schools, the standards achieved by pupils, the efficiency with which financial resources are managed and the spiritual, moral, social and cultural development of pupils. The 1992 Act and the Education Act 1993 also laid down procedures for schools identified during the inspection process as 'failing or likely to fail to give their pupils an acceptable standard of education'.

These requirements set the legal basis for the national system of inspection. Regulations also specified the frequency of inspection; all 25,000 or so maintained schools in England are to be inspected on a 4-yearly cycle. What should be underlined at this point is the scale of the operation in comparison with what went before. At its peak Her Majesty's Inspectorate (HMI) published 350-400 surveys or reports each year, of which 150 reports were of full school inspections.



TABLE I. Inspection contracts awarded to LEA and independent inspection teams

Date         Percent LEA contractor   Percent independent contractor   Number of inspections
Autumn 93    77.3                     22.7                             392
Spring 94    62.7                     37.3                             314
Summer 94    52.2                     47.8                             184

The balance of their work was weighted towards short visits to schools to gather evidence on particular aspects of national interest. In contrast, the new system is a massive operation, with more than 900 school inspections—more or less the equivalent to an HMI full inspection in scale—conducted in 1993/4, and well over 3000 inspections of nursery, primary, secondary and special schools planned in the current academic year (1994/5). At its peak this should rise to 6000 per year. These inspections are led by registered inspectors (RgIs) and inspection teams working under contract to OFSTED, not by OFSTED itself.

Since its creation in September 1992, OFSTED has trained and accredited more than 1200 primary and secondary RgIs, over 3500 inspection team members and 1200 lay inspectors. The highest proportion of the early inspections was conducted by LEA inspection teams, but independent inspection teams have increasingly been awarded a greater proportion of contracts (see Table I). OFSTED has produced and sold over 66,000 copies of the Framework for the Inspection of Schools, which states the inspection requirements; OFSTED published some 60 other publications in 1993/4. At the same time OFSTED has laid the foundation for its Education Information System (EIS) database, which contains inspection reports and other data on all maintained schools.

However, we do not intend simply to report the statistics of what has been done, nor simply to provide an overview of OFSTED's work. This can be found in HMCI's annual report (OFSTED, 1993a) and other OFSTED publications. Instead, we focus on two key aspects of OFSTED's work: first, to indicate OFSTED's commitment to the improvement of inspection procedures, and secondly, to tackle the challenging question of 'improvement through inspection'. If the first phase of OFSTED's work has been dominated by the need to get the national inspection system up and running, then in the next phase the spotlight will reasonably shift to ways of improving inspection methods and the impact of the inspection system on school improvement.

INSPECTION AND QUALITY ASSURANCE

Maintained schools, as publicly-funded institutions answerable to a variety of interests, are subject to at least three forms of external monitoring and evaluation which complement internal procedures such as school self-evaluation and staff appraisal.


These external mechanisms include not only the regular inspection of the school, but also the reporting formally required of schools and their governors to parents and others, and regular financial audit. Inspections are designed to evaluate the work of schools and report on their strengths and weaknesses, as well as the contributory factors. Reports identify key issues, which are subsequently addressed by the governors in preparing their post-inspection action plans.

Inspection is designed to assess whether the school successfully meets its targets in terms of learning outcomes and pupil experiences. These lie at the heart of quality assurance in schools. Their evaluation requires an emphasis on classroom practice as well as outcome measures, and it is the emphasis on teaching and learning, directly observing classes and other learning settings, which is the hallmark of inspection methods in the UK. In many other countries school accountability measures, like some types of quality accreditation in the UK, tend to focus on management practices and school organisation, rather than the ways these are experienced by pupils in the system.

The results of the inspection process have to meet the requirements of many different parties, for example, for:

• accountability, to pupils, parents and taxpayers, for the school's provision and pupils' achievements, as well as value for money;

• compliance with statutory requirements, regulations and duties;

• consumer choice, by publishing the reports and summaries for parents to choose schools, as well as the involvement of parents in the inspection and school improvement process;

• improvement, where inspection may be used by schools and their governors as both an educational audit and as a management tool to help refocus priorities for development and targets for improvement.

IMPROVING INSPECTION

We begin with two examples of the way that OFSTED is working to improve inspection methods.

The quality standard for inspections is OFSTED's Framework for the Inspection of Schools (OFSTED, 1994a). This sets out principles to which registered inspectors and their teams must adhere, a code of practice for inspectors and the detailed schedule which specifies, for every aspect inspected, the criteria against which judgements should be made, the evidence required and the features on which inspectors should report. The Framework, as part of the Handbook for the Inspection of Schools (OFSTED, 1993b), has received wide professional approbation, for example as:

the best book on school management which has ever appeared from official sources. It is a well-polished mirror in which to reflect—and reflect on—the performance and procedures of all areas of school life. (Dunford, 1993)

There is growing evidence that such inspection documentation is being increasingly used as a tool for school self-evaluation and as a vehicle for staff and management development. By October 1993, after only two months of inspection, our evidence suggests that 90% of headteachers, 80% of teachers and 40% of governors in secondary schools had used the Framework extensively to prepare for inspection (OFSTED, 1994b). A third of the heads of schools inspected considered the Framework already to have had an impact on school management and organisation.

We recognise that there is room for further refinement of the procedures for inspection in the Framework, especially now that primary, nursery and special schools are included in the programme. OFSTED is committed to the principles, which include: responsiveness to the views of stakeholders; commitment to continuous improvement of inspection; full staff involvement in development; and anticipating future needs.

Thus OFSTED invests not only in the quality control and regulatory aspects of Section 9 inspections, such as training, registration and monitoring, but also in quality assurance measures. These include surveys of inspectors; interviews with governors, school staff, parents and pupils; meetings with representative groups; and commissioned research. In this way we know from visiting 200 secondary schools, for example, that many secondary schools saw considerable advantages in the present system of inspection:

• independent inspection was accepted as essential to the aim of public accountability;

• the comprehensive nature of the inspection model defined by the Framework was welcomed;

• the clear and open criteria of the Framework were acknowledged as major contributions to the objectivity and security of inspections;

• the Framework and Handbook were seen as important management tools for various purposes such as school self-review, staff development and appraisal;

• the Framework ensures a fair but rigorous basis for the inspection of all schools.

Inspection will only meet the objectives listed earlier in the section on quality assurance if it remains responsive to the concerns of stakeholders. This sometimes requires compromise. Some schools, for example, complain about the burden of having to marshal their documentation and complete a pro-forma giving current data about the school. Registered inspectors, on the other hand, need key information and data about the school in order to: carry out documentary review in advance; identify possible issues for the inspection; form hypotheses to be tested during the time on site; prepare the deployment of the team; and secure judgements.


OFSTED's response has been to: specify the key areas in which documentation is required, rather than a detailed list; reduce the contents of the headteachers' forms and provide guidance for their completion; and issue further advice to registered inspectors.

Among the concerns to emerge last year were, for example, the stress of inspection on small primary schools, the question of 'national norms' when describing standards, and judging the value for money provided by schools.

We have undertaken several 'action research' projects to address these problems. Working with headteachers and registered inspectors, HMI reviewed the evidence base required to report fairly and securely on the smallest primary schools. This led to proposals which were piloted in inspections of very small schools. The schools, however, also contributed to the evaluation of the modified inspection procedures, which were subsequently incorporated in the revised Handbook or issued as guidance to primary inspectors. The outcomes include: a reduction in the number of inspector days on site; guidance on reducing stress on teachers and on reporting when it may not be possible to carry out direct observation of all the subjects of the curriculum; and advice on the management of such inspections.

A further project involved the development of the methods for judging value for money. A proposed model was discussed with the headteachers and governors of 30 secondary schools before being incorporated in the Inspection Handbook. Essentially it involves considering the context of the school and coming to summative judgements about standards (value added), quality of provision, efficiency and ethos, then setting these judgements against the school's unit costs. The pilot schools were content with this basis for evaluation, although many raised additional features that they would like to see incorporated in value for money judgements.

In terms of standards of achievement, the ill-advised use of the word 'norms' has been replaced by comparators such as 'national averages', where these exist, or 'expectations for the age range'. The latest edition of the Framework has also anticipated the Government's encouragement to schools to set their own targets for education and training, reflecting the adoption of national targets. Where schools set such targets, these will be reported.

THE DEVELOPMENT OF EDUCATIONAL INDICATORS

Like many other groups, HMI was influenced by the increasing use of performance indicators in education in the late 1980s. A series of HMI working parties considered ways that greater use could be made by inspectors of qualitative data about the school, the LEA and the immediate area. This pilot work led directly to the development of the PICSI (pre-inspection context and school indicator) reports provided for all OFSTED inspections since September 1993. They draw on a database of information on the school, particularly pupil performance data on GCSE, A level and other public examinations and Standard Assessment Tasks (SATs) where these are available, as well as some LEA and local neighbourhood data.


The data sources include 'Form 7 Schools', DfE examination statistics, the DfE secondary school staffing survey, Section 42 Budget Returns by the LEA, DoE Revenue Account Returns and the 1991 Census. These data are used to produce a simple interpretive report at the individual school level, so that comparisons can be drawn with aggregate information for the LEA, similar types of LEA or national averages. Interpretation has to be simple and indeed partly mechanical. Producing several thousand such reports each year with a small staff requires the process to be automated down to a fixed set of comparisons, with appropriate supporting text and graphics that can be triggered by the statistical data for the school.
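
The mechanics of such a fixed set of automated comparisons can be illustrated with a minimal sketch (all field names, figures, thresholds and wording below are invented for illustration; they are not drawn from the PICSI system itself): each school value is compared with the LEA and national figures, and one of a small repertoire of pre-written phrases is triggered accordingly.

    # Minimal sketch of rule-triggered commentary for a pre-inspection context
    # report. Field names, figures, thresholds and wording are all hypothetical.

    SCHOOL = {"pct_5_plus_gcse_a_to_c": 38.0, "pct_free_school_meals": 31.5}
    LEA_AVERAGE = {"pct_5_plus_gcse_a_to_c": 41.2, "pct_free_school_meals": 24.0}
    NATIONAL_AVERAGE = {"pct_5_plus_gcse_a_to_c": 43.3, "pct_free_school_meals": 18.0}

    LABELS = {
        "pct_5_plus_gcse_a_to_c": "pupils gaining five or more GCSE grades A-C",
        "pct_free_school_meals": "pupils eligible for free school meals",
    }

    def describe(value, benchmark, tolerance=2.0):
        """Map a numeric gap onto one of three fixed phrases (arbitrary threshold)."""
        gap = value - benchmark
        if gap > tolerance:
            return "above"
        if gap < -tolerance:
            return "below"
        return "broadly in line with"

    def commentary(school, lea, national, labels):
        """Produce one triggered sentence per indicator."""
        lines = []
        for key, label in labels.items():
            lines.append(
                f"The proportion of {label} ({school[key]:.1f}%) is "
                f"{describe(school[key], lea[key])} the LEA average ({lea[key]:.1f}%) "
                f"and {describe(school[key], national[key])} the national average "
                f"({national[key]:.1f}%)."
            )
        return "\n".join(lines)

    print(commentary(SCHOOL, LEA_AVERAGE, NATIONAL_AVERAGE, LABELS))

A fixed repertoire of this kind keeps the interpretation simple and reproducible across several thousand reports a year, at the cost of the caveats discussed below.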

The PICSI report is given to the RgIs prior to the inspection, to be discussed with the headteacher. Thus unlike some performance indicator systems which are, as it were, the end of the road, the PICSI report is very much the starting point, intended to place the school in a broader context. We have always accepted that such indicators may be fallible in any individual case. There may be data error (at any point from the school's completion of the 'Form 7 Schools' through to its entry into the OFSTED system). Inevitably the national data lag behind the real world. We have improved the rapidity with which data are brought into use, but data will always by definition be historic. Thus individual RgIs may well have access to more up-to-date information directly from the school or LEA, though this will lack the national data comparisons. Finally, it may be that the context data do not apply to the school in some way. Thus local neighbourhood data may be inappropriate for selective or denominational schools that do not draw from their immediate area. While it might be technically feasible to try and identify actual school catchment areas, this is hardly practical with some 25,000 schools. For this reason, the PICSI reports are well-marked with caveats about their use. These caveats underline the fallibility of a wholly indicator-based system of school assessment.

The PICSI system has been improved and upgraded regularly since it was introduced. Since 1993 the system has included 1991 census data for the school neighbourhood. In 1994 it proved possible to add the first indicators making comparisons between performance at GCSE and A level based on the national figures produced by the DfE. However, 'value added' measures are not yet available for other age groups.
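
One way of reading a GCSE-to-A-level comparison of this kind is as a residual from the national relationship between the two: fit the national line relating mean GCSE points to subsequent A-level points, then report how far a school's actual A-level outcome sits above or below the figure predicted from its pupils' GCSE starting points. The sketch below is illustrative only; all numbers are invented and the DfE's actual method is not reproduced here.

    # Hedged sketch: 'value added' between GCSE and A level as a residual from
    # a fitted national GCSE -> A-level relationship. All figures are invented.
    import numpy as np

    # Fictional national calibration data: mean GCSE points score and mean
    # A-level points score for a set of institutions.
    national_gcse = np.array([4.8, 5.2, 5.6, 6.0, 6.4, 6.8])
    national_alevel = np.array([10.5, 12.0, 14.1, 16.2, 18.0, 20.3])

    # Fit a straight line giving expected A-level points for a given GCSE mean.
    slope, intercept = np.polyfit(national_gcse, national_alevel, deg=1)

    def value_added(school_gcse_mean, school_alevel_mean):
        """Actual minus predicted A-level points for one school."""
        predicted = slope * school_gcse_mean + intercept
        return school_alevel_mean - predicted

    # A school whose entrants averaged 5.5 GCSE points and which achieved 15.0
    # A-level points shows a small positive residual on these invented figures.
    print(round(value_added(5.5, 15.0), 2))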

One clear problem is that though there are data about the context and performance of the school at secondary level, these are not related in any way. It remains up to the RgIs to use them as they deem appropriate. Gray & Hannon's (1986) analysis suggests that there is considerable potential for inconsistency at this point. How much, if at all, should such factors be taken into account? It is one thing to supply the raw material, another to ensure that it is used in a consistent way.

The importance of a value added approach to assessing school effectiveness is widely accepted. Unfortunately, as yet, such value added possibilities only exist at a national level between GCSE and A level. In the long-term we would hope to include some form of value added measurement in the PICSI package, though even this apparently simple objective raises a number of questions about the assessment data.


The prospect of national value added measures covering the compulsory phase has receded towards the end of the decade.

In the interim, OFSTED has commissioned a research team at the London Institute of Education to explore how far it is possible to use existing national data to group secondary schools together to ensure, as far as possible, when assessing school performance and effectiveness, that like is compared with like (Sammons et al., 1994). Making use of a range of background data on the pupils, on the school and its locality, and using multiple regression, including multi-level techniques, the research team demonstrated that a limited number of variables could be used to group schools into broadly similar categories. The basic method was to identify a set of variables that best predicted existing GCSE results. These could then be used to group schools on the basis of their predicted results, though of course their actual results varied considerably, a possible indicator of more or less effective schools.
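
A minimal sketch of this grouping idea is given below, using invented data and ordinary least squares only; the Sammons et al. (1994) analysis used a far richer set of background variables and multilevel modelling. The point illustrated is simply that background factors are used to predict GCSE outcomes, schools are banded by their predicted results, and the spread of actual results within a band is then of interest.

    # Sketch of like-with-like grouping: predict GCSE outcomes from background
    # variables, band schools by predicted result, and examine residuals within
    # bands. Data and variable names are invented for illustration.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    n_schools = 200

    schools = pd.DataFrame({
        "pct_free_school_meals": rng.uniform(2, 60, n_schools),
        "pct_english_additional_language": rng.uniform(0, 40, n_schools),
    })
    # Invented outcome: percentage of pupils with five or more GCSE grades A-C.
    schools["pct_5_plus_a_to_c"] = (
        55
        - 0.5 * schools["pct_free_school_meals"]
        - 0.1 * schools["pct_english_additional_language"]
        + rng.normal(0, 6, n_schools)
    )

    # Ordinary least squares of the outcome on the background variables.
    X = np.column_stack([
        np.ones(n_schools),
        schools["pct_free_school_meals"],
        schools["pct_english_additional_language"],
    ])
    coefs, *_ = np.linalg.lstsq(X, schools["pct_5_plus_a_to_c"], rcond=None)
    schools["predicted"] = X @ coefs

    # Band schools into five groups of broadly similar predicted results; the
    # residual (actual minus predicted) within a band is a crude effectiveness signal.
    schools["band"] = pd.qcut(schools["predicted"], q=5, labels=False)
    schools["residual"] = schools["pct_5_plus_a_to_c"] - schools["predicted"]
    print(schools.groupby("band")["residual"].agg(["mean", "std"]).round(2))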

The next stage for OFSTED is to examine how far it will be possible to add this information to secondary school PICSI reports. These developments show a commitment to improving the technical support provided for inspection. It may be a modest step, but it should be remembered that any development has to work on a very large number of schools in a relatively short time space and without overloading an already demanding schedule for inspection.

IMPROVEMENT THROUGH INSPECTION

The second of our major sections deals with the challenging question of improvement through inspection. Some of those critical of OFSTED's programme come close to asserting that inspection cannot of itself bring about improvement; that quality control is by definition not quality assurance. Clearly, OFSTED's programme is only one of a series of changes that have taken place, including the National Curriculum and its assessment, changes in the organisation and management of schools and the 'failing schools' policies, to name but a few. Disentangling the impact of these different changes is likely to be the subject for research for many years to come. Whether OFSTED's programme contributes to school improvement has to be established, rather than simply asserted. However, there is growing evidence that the introduction of the inspection system, together with statutory post-inspection action planning, is making a contribution to school improvement.

We examine the impact of OFSTED's programme first at the level of the school and then at the national level.

(i) School Level

From surveys of secondary schools, we know that significant benefits of inspection already include:


• the value of having an external audit of achievements, strengths and weaknesses, providing information for parents and accountability for the expenditure of public money;

• the growth in confidence and morale resulting from affirmation of a school's quality and direction;

• the major impetus provided to focus thinking on aspects of the school which did not meet the Framework criteria and its power to act as a catalyst to accelerate policy review and staff development;

• the identification of areas for improvement, although some inspection reports still need to make these more clear.

There is much evidence that preparation for inspection results in school improvement, but evidence is beginning to mount that inspection itself is already having an impact. We know, for example, that the majority of schools inspected in autumn 1993 have not only met the statutory requirements in producing their action plans but also responded in terms of the guidance included in DfE Circular 7/93 (DfE, 1993). Action plans must be produced within 40 days of the end of the inspection, circulated to all parents and must address the key issues in inspection reports. Many of the more straightforward items in early action plans have been implemented; for example, amended curriculum and assessment policies, improvement in management and administration, revised syllabuses and schemes of work and the provision of specific equipment and materials for learning. However, few governing bodies have worked out how they will monitor the achievement of planned targets. More than a quarter of schools have incorporated inspection issues into their staff development programmes. A smaller number of action plans have also set out practical strategies to tackle more complex matters, such as the improvement of teaching and raising standards. OFSTED has monitored a 10% sample of the action plans of schools inspected during the first year. Many of these schools have been visited by HMI, who have evaluated:

• the fitness for purpose of the plan and whether or not it met the statutory requirements;

• the quality of the planning process;

• the capacity of the school to implement the plan;

• the initial impact of the inspection and the action plan on improving quality and standards.

OFSTED is also concerned to evaluate the implementation of action plans over a longer period. This involves a longitudinal study of the sample of schools, resulting in evaluation of progress made within a period of 18 months from the publication of the report. New improvement indicators have been developed. In the case of failing schools, or those having serious weaknesses, some evidence of progress is sought after a period of 6 months. One weakness in action planning is the difficulty many schools experience in establishing clear indicators or criteria for success against which their progress may be monitored.


(ii) National Level

It is sometimes assumed that 'improvement through inspection' is restricted to the school level; that OFSTED's impact, in other words, should be seen only in terms of its impact on the school inspected. This is clearly too limited and ignores the second important part of OFSTED's responsibility, which is to use the information collected through inspection to provide advice to the Secretary of State and draw attention to issues of educational concern.

This was also traditionally the HMI role, continued in such reports as Access and Achievement in Urban Education (OFSTED, 1993c), which unquestionably helped place education in disadvantaged urban areas firmly back on the national agenda. Work since then has focused on ways of improving education in such areas, not just by OFSTED but by many other groups.

The Educational Information System (EIS), containing both text and data on schools based on school inspection as well as existing statistics on all schools, has been up and running for some months. It now contains data from the first year of secondary inspections. This potentially allows a far wider trawl of material on which to base advice on the current state of play, involving data from all schools plus the 25% or so that will have been inspected in the present year.

Following up the main theme of the Access and Achievement in Urban Education study, we present some preliminary data drawn from the first year of secondary inspection data, which throws some light on the problems of schooling in disadvantaged areas. A better understanding of the current strengths and weaknesses of schools in such areas must be the starting point for a programme of improvement.

We have simply drawn on a few of the summative judgements that inspectors were required to make on each school, covering a range of key characteristics. These were in the form of a seven-point scale, running from 'excellent' to 'very poor'. We have broken the scale into the top three ('favourably judged') and bottom three ('unfavourably judged') scale points. For comparison we have presented data separately for schools in the highest two and lowest two social contexts, also based on the inspectors' ratings. Figures 1-3 show something of the way that inspectors' ratings vary sharply, not just between schools in different areas, but by the different school characteristics rated.

Thus Fig. 1, based on the first year of secondary inspection, shows the sharply different ratings inspectors gave to schools in the most and least advantaged areas in terms of overall standards of achievement. Figure 2 shows the same pattern, but in much less extreme form, when overall standards are rated in relation to pupils' capabilities. Finally, Fig. 3 shows the pattern reversed. In terms of the availability of accommodation (access to additional space), schools in the most disadvantaged areas received a better rating.

These are but three examples from an extensive set of ratings. Much more detailed analysis will be required to draw out the patterns of strengths and weaknesses of schools in different areas.
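
The aggregation lying behind Figs 1-3 can be set out as a short sketch. The collapse of the seven-point scale into 'favourably judged' and 'unfavourably judged' bands and the social-context groupings follow the description above; the column names, the direction of the context coding and all of the ratings themselves are invented for illustration.

    # Sketch of the aggregation behind Figs 1-3: a seven-point judgement scale
    # collapsed into favourable (top three points) and unfavourable (bottom
    # three points) bands, broken down by the inspectors' social-context rating.
    # All ratings below are randomly generated, so no real pattern will appear.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(1)
    n = 900  # invented sample, roughly the scale of the first year's inspections

    df = pd.DataFrame({
        "social_context": rng.integers(1, 8, n),           # 1-7 context scale (coding assumed)
        "achievement_vs_national": rng.integers(1, 8, n),  # 1 = excellent ... 7 = very poor
    })

    def summarise(sub):
        """Percentage of schools in the favourable and unfavourable bands."""
        grade = sub["achievement_vs_national"]
        return pd.Series({
            "favourably_judged_pct": 100 * (grade <= 3).mean(),
            "unfavourably_judged_pct": 100 * (grade >= 5).mean(),
        })

    groups = {
        "Social context 1-2": df[df["social_context"].isin([1, 2])],
        "Social context 6-7": df[df["social_context"].isin([6, 7])],
        "All social contexts": df,
    }
    print(pd.DataFrame({name: summarise(sub) for name, sub in groups.items()}).round(1))

Note that the middle point of the seven-point scale falls in neither band, so the two percentages need not sum to 100.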


FIG. 1. Inspector rating of schools by social setting. Standards of achievement in relation to national averages: KS3. [Bar chart: percentage of schools favourably and unfavourably judged, for social contexts 1-2, 6-7 and all social contexts.]

LOOKING AHEAD

How should inspections in England develop in the future? There is cause for optimism in the continuing improvement of the standard of inspection. Monitoring of the conduct of inspections and the quality of reports has begun to weed out the few RgIs who are falling short of the requirements of the Framework. The standard of written reports on secondary schools has risen during 1993/94. Many inspectorates and contractors are adopting increasingly efficient and effective approaches to the establishment of teams and the management of inspection.

FIG. 2. Standards of achievement in relation to pupils' capabilities: KS3. [Bar chart: percentage of schools favourably and unfavourably judged, for social contexts 1-2, 6-7 and all social contexts.]


FIG. 3. Availability of accommodation: KS3. [Bar chart: percentage of schools favourably and unfavourably judged, for social contexts 1-2, 6-7 and all social contexts.]

Furthermore, contracts are awarded only to bidders who meet a high quality threshold, not simply to the lowest tender. The track record of RgIs is taken into account in assessing their tenders.

OFSTED has under active consideration other ways of refining the inspection system to enhance its contribution to school development and improvement. It is concerned with such questions as:

• how should the inspection process reflect changes in the National Curriculum, assessment and testing?

• how effective is the Framework for the inspection of primary schools?

• how can the burden on schools and bureaucracy for inspectors be further reduced whilst maintaining the need for a sound evidence base to secure judgements?

• what is the scope for introducing greater recognition, coupled with external validation, of schools' own evaluation?

• how can the cost-benefit of inspection be maximised?

Improvement, though, means change. There are some risks inherent in all development, particularly in the case of inspection, which relies on complex and not always clear-cut market forces, some of which are in tension. Modifying the Framework, however slightly, introduces perturbation into the inspection market. Revising the Handbook for Inspection has implications for the many purchasers of the previous edition. OFSTED will not only continue vigorous efforts to consult interested parties, but has also established a national inspection advisory group to consider solutions to some of these issues. Better quality inspection must mean a better basis for school improvement.


Correspondence: Dr Peter Matthews, OFSTED, Alexandra House, 29-33 Kingsway, London WC2B 6SE, UK.

REFERENCES

DEPARTMENT FOR EDUCATION (1993) Circular No. 7/93 (London, HMSO).

DUNFORD, J. (1993) Managing the School for Inspection (London, Secondary Heads Association).

GRAY, J. & HANNON, V. (1986) HMI's interpretation of schools' examination results, Journal of Education Policy, 4, pp. 23-33.

OFSTED (1993a) Standards and Quality in Education, 1992-93: the annual report of HMCI (London, HMSO).

OFSTED (1993b) Handbook for the Inspection of Schools (London, HMSO).

OFSTED (1993c) Access and Achievement in Urban Education (London, HMSO).

OFSTED (1994a) Framework for the Inspection of Schools (London, OFSTED).

OFSTED (1994b) A Focus on Quality (London, OFSTED).

SAMMONS, P., THOMAS, S., MORTIMORE, P. et al. (1994) Assessing School Effectiveness: developing measures to put school performance in context (London, OFSTED).
