Introduction

In response to a decline in literacy and numeracy standards, since 2012 all schools in Ireland have been required to engage with school self-evaluation (SSE) using a school self-evaluation framework devised by the Department of Education and Skills (DES) inspectorate (DES 2012a) and to ‘produce three-year improvement plans for numeracy, literacy and one aspect of teaching and learning across all subjects and programmes’ (DES 2012b, p. 2). The purpose of SSE was described by the Chief Inspector of Ireland, who stated: ‘our ultimate goal is for schools to conduct their own evaluations transparently and accurately and for inspectors to visit these schools to evaluate the school’s own self-evaluation’ (Seomra Ranga 2012). This policy direction represents a major departure from the primacy of external inspection, which was the cornerstone of school evaluation in Ireland before 2012. However, far from being a radical initiative confined to Ireland, it is closely aligned to broader international trends in school evaluation policy.

The chapter begins by briefly placing this policy shift in its wider context and then focuses on its implementation to date in the Irish school system. The chapter provides a documentary analysis of the changing landscape of school inspection and SSE policy and practice in Ireland from 1998 to the present. Then, drawing on interview data and a national survey of post-primary school principals in Ireland, the main section explores attitudes towards the creation of what has now become a mandatory culture of SSE.

The Changing Relationship Between School Inspection and SSE

In an evaluation paradigm shift, schools in many countries where inspection exists are being given greater autonomy to put in place local mechanisms to improve the quality of education offered. However, ‘in return for this autonomy, schools are being required to evaluate their own educational quality and to come up with their own plans for improvement’ (Vanhoof and Van Petegem 2007, p. 102). Theoretically, this policy shift is justified, at least in part, by a related and increasingly widely accepted notion (see Ehren et al. 2013; Gray 2014) that inspection models are adapted as education accountability systems mature: schools and their stakeholders develop self-evaluation literacy and the innovation capacity to improve education on their own, and thus have a diminished need for top-down inspections and reform initiatives.

With the widespread introduction of SSE and the interrelated drive for data-driven school self-regulation, it would be reasonable to suggest that inspectorates of education are facing a changed landscape. Grek et al. (2013) point out that while inspection is not new ‘the contexts in which it currently operates greatly extend the demands upon it, and require attention to how it bridges the regulation/self-evaluation “gap” in different national and local settings’ (p. 488).

One solution to bridging this ‘regulation/self-evaluation’ gap is the development of inspectorate-devised SSE frameworks that try to accomplish the following: (1) counterbalance the increased autonomy afforded to schools; (2) ensure the validity and reliability of schools’ internal evaluations; (3) enable central government to have a comparative picture of the quality of education in a country and to ascertain whether a particular policy is working or not; and (4) ensure that the improvement initiatives of each school are in line with the collective educational reform initiatives of a country or region. The overarching logic for this mode of evaluation co-existence is described by Donaldson (2013, p. 11), who states:

The powerful relationship between external and internal evaluation is central to stimulating improvement. Each can make a particular contribution, but the synergies arising from the combination of the two can bring particular benefits. Inspectorates are increasingly emphasising the importance of effective self-evaluation as a driver of improvement. But self‐evaluation can become self-delusion or worse and must operate within a framework of accountability which both encourages its rigor and validates its authenticity.

It is in this broader international policy context that recent developments in Ireland must be understood.

The Re-birth of School Inspection in Ireland

Since the Education Act of 1998 (Government of Ireland 1998) Ireland has experienced profound changes in its school evaluation arrangements. Despite a history dating back to the early 19th century, school inspection at post-primary level had almost ceased to exist. However, in the Education Act, school inspection was, for the first time in the history of the state, put on a legislative footing: ‘The functions of an Inspector shall be: to support and advise recognised schools, centres for education and teachers on matters relating to the provision of education…’ (Government of Ireland, Education Act 1998, section 13 (3)).

With what might be described as a nearly blank canvas, the inspectorate set itself ambitious targets around the recruitment of inspectors who were charged with the development and implementation of multi-mode inspection frameworks and SSE instruments. The inspectorate was also tasked with ensuring that school inspection would once again become an accepted part of the system as the process had become a largely unfamiliar concept to the majority of post-primary school principals and teachers in Ireland. In this regard the revival of inspection has been achieved. It has become once more a regular feature of school life and according to the DES (2013), ‘between 2011 and 2012 inspections of some type occurred in 93% of second-level schools’ (p. 22).

Although from 1998 to 2012 the immediate evaluation priority for the Irish education system was external inspection, SSE was nonetheless recognised by the inspectorate as a complementary and essential component of school improvement.

Ireland, along with other European countries, is adopting a model of quality assurance that emphasises school development planning through internal school review and self-evaluation with the support of external evaluation carried out by the Inspectorate. (Department of Education 2003, p. viii)

As a result, the inspectorate developed Looking at Our Schools: An Aid to Self-Evaluation in Second-Level Schools (LAOS) (DES 2003). Paralleling the external inspection framework, LAOS set out five self-evaluation themes: school management, school planning, curriculum provision, learning and teaching in subjects, and support for students. It was expected that a school would choose a theme on which to focus and, using available evidence, would grade the chosen theme along a four-point continuum from ‘significant strengths’ to ‘weaknesses outweigh strengths’. Although there was no requirement for schools to engage with LAOS, and little evidence to suggest that schools had the skill set required to gather and analyse data to any significant degree (see McNamara and O’Hara 2005, 2012), LAOS did serve one significant purpose, namely a closer alignment between internal and external concepts of quality. According to Brown (2011), ‘LAOS is used in some schools for the purpose of gathering evidence in preparation for school inspection’.

However, in practice, inspectors did not require schools to provide evidence that self-evaluation formed a significant part of the school development planning process. This is confirmed by the DES, which stated that, ‘recognising that the more impact-focused, school improvement-focused approach of SSE was one with which many schools were not yet familiar, inspectors did not generally apply SSE expectations to the planning processes of schools during the inspections they undertook’ (2013, p. 40). In 2012, however, SSE evolved quite suddenly from being a largely rhetorical concept to a very real imposition on schools and teachers.

SSE Moves Centre Stage

In 2012 the inspectorate produced a comprehensive set of guidelines for SSE (DES 2012a, b, c, d), the purpose of which was described by the then Minister for Education as follows:

The School Self-Evaluation Guidelines will support schools to evaluate their own work and to set targets to improve teaching and learning. This will help to achieve the targets set out in the Programme for Government and in the National Literacy and Numeracy Strategy, launched by the Minister last year. (Quinn Nov 19, 2012)

The reasons for this rapid change of policy can only really be guessed at. The Inspectorate would point out that self-evaluation had been part of school evaluation since 2003 and, while more honoured in the breach than the observance, was always likely to be stepped up at some stage. Also, as mentioned above, the increased role accorded to SSE in Ireland was in line with similar developments elsewhere, and it is very evident in recent years that inspectorates are working more closely together and are heavily influenced by new policies and practices in other countries. Finally, the theory and practice of inspection becoming more indirect, in the sense of being concerned primarily with overseeing SSE as opposed to conducting hands-on inspections, is no doubt welcome in the context of limited resources and falling numbers of inspectors.

It now became a requirement for all schools in Ireland to engage with SSE in the manner prescribed in the self-evaluation framework. Moreover, the framework was quite prescriptive both in the areas to be evaluated and in the methodology to be used. The latter included the statistical analysis of the results of state examinations, scores from standardised tests (which are now compulsory), attendance and early-leaving data, and surveys of both parental and student opinion. The framework also urged the use of management and peer review of teaching, a very controversial procedure in Ireland. All of this data was to be used to develop a short but specific improvement action plan, including clear targets, in the area under evaluation, for example literacy standards.
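To make the data-handling demands of the framework concrete, the following minimal sketch shows how a school might compute a literacy baseline from standardised test scores and derive an improvement target. The scores, the STen-5 threshold and the ten-point target are illustrative assumptions made for the example, not figures drawn from the DES framework.

```python
# Illustrative sketch only: a minimal baseline-and-target calculation of the
# kind the SSE framework asks schools to perform. All data are hypothetical.
from statistics import mean

# Hypothetical standardised reading scores for one year group (STen scale, 1-10).
sten_scores = [4, 5, 6, 5, 7, 3, 6, 5, 8, 4, 5, 6, 7, 5, 4]

baseline_mean = mean(sten_scores)
share_at_or_above_5 = sum(1 for s in sten_scores if s >= 5) / len(sten_scores)

print(f"Baseline mean STen score: {baseline_mean:.1f}")
print(f"Proportion at STen 5 or above: {share_at_or_above_5:.0%}")

# A SMART-style target of the kind an improvement plan might contain, e.g.
# raise the proportion at STen 5 or above by ten percentage points over the
# three-year plan (a target chosen here purely for illustration).
target = min(1.0, share_at_or_above_5 + 0.10)
print(f"Three-year target: {target:.0%} at STen 5 or above")
```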

To ensure that all schools would engage with the self-evaluation process, DES Circular Nos. 0040/2012 (DES 2012a) and 0039/2012 (DES 2012b) required all schools to conduct self-evaluations starting in the academic year 2012/2013, and to do so in accordance with the inspectorate-devised school self-evaluation guidelines (DES 2012b, c). Furthermore, for a number of reasons, such as Ireland’s ‘PISA shock’ in 2010, Circular Nos. 0040/2012 and 0039/2012 also required school self-evaluations to focus on literacy, numeracy, or an aspect of teaching and learning, and ‘in subsequent years, schools should select again from the above options so that, within the four-year period, a School Self-Evaluation report and a three-year school improvement plan (SIP) for literacy, for numeracy and for one aspect of teaching and learning across all subjects will be completed’ (2012a, p. 3). Figure 4.1 provides a sample time-line for SSE during this period.

Fig. 4.1 Sample time-line for school self-evaluation (2012–2016)

Not surprisingly, the suddenly increased emphasis on SSE came as a considerable shock to schools, which had been used to rhetoric but no real requirement to self-evaluate. This development came in the middle of a deep economic downturn which had seen resources cut sharply, including the non-replacement of senior and middle management staff. Moreover, although schools had always had data available, its use in any systematic way was very limited, as were research skills and experience. To give context to the responses of schools reported later, the next section looks briefly at some of the challenges faced in implementing the new SSE regime from 2012 onwards.

Issues Concerning School Self-Evaluation Implementation

According to Cheng (2010, p. 985), when establishing mechanisms for school development evaluation, consideration should be given to ‘teachers understanding of planning and how to collect the evaluation data and its supporting sets, otherwise, the failure possibility will be increased’. Criticisms relating to lack of internal evaluator capacity have been highlighted by McNamara and O’Hara (2008, p. 175) among others. They point out [when referring to Elliot’s (1995) research on self-evaluation] that the self-evaluation movement that was popular in the late 1970s and early 1980s declined because ‘neither training, experience or professional culture had allowed teachers to develop the discursive consciousness necessary to become reflexive, self-aware and thus able to self-evaluate’. Although it is perhaps arguable that inspectorates, as full-time professional evaluators, have been trained adequately and have the necessary skills to systematically collect and analyse qualitative and quantitative data, it is unwise to assume that this is the case with school personnel.

Indeed, school personnel’s lack of capacity to carry out meaningful and worthwhile evaluations mirrors the assertion of Vanhoof et al. (2010, p. 2) that internal evaluators, such as school principals, ‘are usually not trained in carrying out research, collecting data, data management or data interpretation’. The resulting absence of information-rich environments and inadequate evaluation skills inevitably leads to valuable information either being neglected or mistreated, culminating in what Blok et al. (2008, p. 387) refer to as, ‘an armchair analysis without any empirical evidence’. Furthermore, school evaluation has been conceived out of an experts-based professional model, which ‘creates tensions for these novice, school-based evaluators who meet their teaching responsibilities while being expected to attain at least some professional evaluator skills and knowledge—often with minimum support’ (Ryan et al. 2007, p. 208).

Research also suggests that the absence of post-evaluation support has a debilitating effect on schools’ motivation to become engaged in an active discourse for improvement (see MacBeath 1999; McNamara and O’Hara 2005; Vanhoof and Van Petegem 2007). It therefore appears that continuous support is required for schools to move from complying with rudimentary evaluation tasks to utilising a more practitioner research-based approach to improvement.

In addition, the required level of support clearly depends on the complexity of the proposed improvement actions. For example, it appears that the actions needed to improve areas such as literacy levels, school attendance, behaviour and parental engagement are far more complex in schools where there is a high proportion of social deprivation. MacBeath (1999, p. 152) asserts that ‘there is a continuum of support that is needed, with very little or none at one end to strong and sustained support and intervention at the other’.

The Inspectorate in Ireland has recognised these issues and has tried to enable SSE capacity building and support as an essential component of the creation of a dual culture of evaluation. For example, all principals are now provided with in-service training on the rudiments of the process, including training on data analysis. Support also comes in the form of very detailed guidelines and a dedicated website (www.school-selfevalution.ie). As an inspector put it, ‘so, we need a lot of systems in place, and we need up-skilling of staff in schools in order to ensure that internal evaluation will work to everybody’s benefit’ (Brown 2013, p. 129).

Nonetheless, concerns remain. For example, one might question how valuable, in practice, data of this type will be in forming evaluation judgements, even given its increasing availability and use. Referring to value-added assessments used in Tennessee schools in the United States, Amrein-Beardsley (2008), in reference to Morgan (2002), states that ‘confusing data reports and a lack of training for teachers and administrators in how to understand the data reports were preventing schools and teachers from using value-added data to improve student learning and achievement’ (p. 67). Similarly, Heritage and Yeagley (2005, p. 333) are of the view that ‘misinterpreting data or relying on a single, often unreliable, data point to make crucial decisions may be even more detrimental to a school and its students than having no data at all’.

As a result, lack of internal evaluator capacity can negatively affect levels of trust between inspectorates and schools where issues relating to the reliability and validity of self-evaluation reports are concerned. As Terhart (2013) states, ‘without an adequate support and training system the managerial, data-driven approaches to raise the quality of teaching will have no effect and will therefore not work where it is most needed’ (p. 494).

As indicated above, the new centrality accorded to SSE in school evaluation policy and practice only dates back to 2012. The research reported below is the first systematic attempt to ascertain the responses of schools to this initiative and to identify the mechanisms that need to be put in place to ensure that the potential of SSE is achieved.

Research Design

This study used a multi-phase convergence research design consisting of three distinct phases. Each phase of the research consisted of concurrent levels that were sequentially aligned to provide an overall interpretation of the study.

  • Phase one: Document Analysis

This phase of the research consisted of a documentary analysis of SSE policy and practice in Ireland from 1998 to the present.

  • Phase two: Development of Questionnaire and Interview Schedule

This phase of the research consisted of the development of a questionnaire and interview schedule using a modified version of Bushnell’s (1990) conceptual framework for evaluating training (Fig. 4.2).

Fig. 4.2 Net worth and sustainable commitment to evaluation activities (source Brown et al. 2013, Fig. 3)

  • Phase three: Data Collection and Analysis

An online questionnaire and cover letter explaining the ethical considerations and purpose of the research were emailed to all post-primary principals in Ireland (n = 732). A total of 351 questionnaires were returned, a response rate of 48% of the total population of post-primary principals in Ireland. All questions, although interrelated, were classified according to their location in the input/process/output/outcomes system model (Fig. 4.2). From this, descriptive statistics were used to provide a broad interpretation of principals’ perceptions of mandatory SSE in Ireland.
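As a check on the figures reported above, the short sketch below reproduces the response-rate arithmetic and shows the general style of descriptive summary used; the Likert responses in the second half are invented purely to illustrate the form of the analysis, not actual study data.

```python
from collections import Counter

# Response-rate arithmetic as reported in the text.
population, responses = 732, 351
print(f"Response rate: {responses / population:.0%}")  # -> 48%

# Hypothetical 5-point Likert responses to a single questionnaire item,
# summarised with simple descriptive statistics (illustrative data only).
likert = ["agree", "agree", "strongly agree", "neutral",
          "disagree", "agree", "strongly agree", "agree"]
for option, n in Counter(likert).most_common():
    print(f"{option}: {n}/{len(likert)} ({n / len(likert):.0%})")
```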

The next stage consisted of a series of semi-structured interviews with a sample of principals (n = 24) to gain a greater understanding of the questionnaire responses. Selection of participants was based on a stratified purposeful sampling strategy with an equal distribution of principals from the various school types that exist in Ireland. As Patton (2002, p. 240) states, ‘the purpose of a stratified purposeful sample is to capture major variations rather than to identify a common core, although the latter may also emerge in the analysis’.
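One minimal way to implement such a stratified selection is sketched below. The sampling frame, the three school-type labels and the sector sizes are all assumptions made for the example; the study’s actual frame and strata are not reproduced here.

```python
import random
from itertools import groupby

# Hypothetical sampling frame of (principal_id, school_type) pairs. The sector
# labels and sizes below are illustrative assumptions, not the study's data.
frame = list(enumerate(
    ["voluntary secondary"] * 380
    + ["ETB"] * 250
    + ["community/comprehensive"] * 102))

random.seed(42)  # fixed seed so the sketch is reproducible

def stratified_sample(frame, per_stratum):
    """Draw an equal number of principals from each school type."""
    by_type = lambda record: record[1]
    sample = []
    for school_type, group in groupby(sorted(frame, key=by_type), key=by_type):
        sample.extend(random.sample(list(group), per_stratum))
    return sample

# 24 interviewees drawn equally from the three assumed strata (8 each).
interviewees = stratified_sample(frame, per_stratum=8)
print(len(interviewees))  # -> 24
```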

Finally, Creswell’s (2008) data analysis process and Miles and Huberman’s (1994) ‘Components of Data Analysis: Interactive Model’ were used for interview coding and analysis. It was then possible to converge interview data with the other phases of the study to provide an overall interpretation of the research.

Presentation and Analysis of Data

Input

This section presents the input phase of the evaluation cycle. The first subsection describes principals’ perceptions of available and future resources required for SSE. Leading on from this, the second subsection ascertains participants’ perspectives on the extent to which members of the school community have the capacity to carry out SSE effectively.

Resources

As is illustrated in Table 4.1, more than 60% of principals in Ireland believe that existing resources provided by the DES are useful for SSE, with one principal stating that ‘the departments SSE website [www.school-evaluation.ie] has been good to see what other schools are doing’ and another stating that ‘the templates to analyse Leaving Cert results have been excellent to see how we compare to other schools’. On the other hand, although DES resources are seen as being useful for SSE, more than 85% of principals are of the view that there is still a need for more resources, a view shared by the DES SSE advisory group: ‘The resources required to engage with SSE must be provided—otherwise there is a danger that it becomes a box ticking exercise’ (DES 2015a, b, p. 3).

Table 4.1 Descriptive statistics: SSE resources

From an analysis of the qualitative data, and given the infancy of SSE in Ireland, there appears to be a need for more SSE case studies from schools, coupled with resources on how to use assessment data for target setting. Moreover, almost all principals are of the opinion that there is an overwhelming need to provide the necessary human resources to realise fully the potential of SSE.

According to one principal: ‘They don’t have to be perfect, but it would be good to see real case studies where a school shows what they did right and what they did wrong with some literacy or numeracy strategy and then we would know what’s needed’. It appears, therefore, that making more case studies available to schools would also bring internally presumed and externally assumed notions of SSE into closer alignment, which could in turn foster self-organisation in schools. ‘Self-organizing is the process by which people mutually adjust their behaviours in ways needed to cope with changing internal and external environmental demands’ (Cilliers 1998 cited in Anderson et al. 2005, p. 673).

Issues surrounding the lack of assessment resources were also of particular concern to most principals. A recurring theme evident in all of the qualitative data was the lack of assessment tools and resources for key SSE areas such as literacy and numeracy. According to one participant, ‘we need more resources and funding for WRAT and CAT (standardised tests) and how to analyse the data’. Another principal stated, ‘more resources and simple-to-use instruments would be helpful. A bank of measures of numeracy and literacy across the board for all schools would be useful’. Another principal also stated, ‘statistical analysis and SMART Target Setting are not adequately embedded’.

Regarding a generic set of tools being available to schools, although 75% of principals were favourable to these resources, they were also of the belief that the tools provided should be adaptable to the school context, particularly when issues of assessment relating to various socio-economic groups are concerned. As one principal states, ‘not all schools have the same resources, the same type of student cohort’. However, according to the Chief Inspector, ‘adjusting test results for socioeconomic factors is a disputed science and in any case, it is a most expensive process—one we could certainly not afford readily at present’ (Hislop 2012, p. 22). Nonetheless, for those who are proponents of the use of contextual data, it appears that, for the moment, many post-primary schools are left with no alternative but to provide a context-free account of how they compare with every other school in Ireland, regardless of the various antecedent variables, such as socio-economic status, that affect student test scores.

However, by way of contrast, Gorard (2010) questions the emphasis placed on value-added measures of quality when referring to the school effectiveness model in the United Kingdom. He argues that if policy makers had a greater understanding of the limitations of value-added measures, they might begin to question the usefulness of the ever-increasing dominance of school effectiveness models more generally and look towards more valuable processes than interpretations of test scores.
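To make this debate concrete, the sketch below shows the simplest possible form of a value-added calculation: regress outcome scores on prior attainment and read the residual as ‘value added’. The data are invented, and the one-predictor model is a deliberately crude stand-in for the far richer (and contested) models used in real value-added systems.

```python
# Requires Python 3.10+ for statistics.linear_regression.
from statistics import linear_regression, mean

# Hypothetical national reference data: prior attainment vs. later outcome.
national_prior = [40, 45, 50, 55, 60, 65, 70, 75]
national_outcome = [42, 46, 52, 56, 61, 66, 72, 76]

# Fit the expected outcome given prior attainment on the reference sample.
slope, intercept = linear_regression(national_prior, national_outcome)

# One school's (hypothetical) pupils, compared against the national line.
school_prior = [45, 52, 60, 38, 70]
school_outcome = [49, 56, 64, 42, 74]
residuals = [y - (slope * x + intercept)
             for x, y in zip(school_prior, school_outcome)]

# A positive mean residual is read as the school 'adding value' beyond what
# prior attainment predicts; Gorard's critique is that such readings are
# highly sensitive to model choice, missing context and measurement error.
print(f"Crude value-added estimate (mean residual): {mean(residuals):+.2f}")
```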

Finally, the human resources required to conduct SSE activities were identified by almost all principals as a significant barrier to embedding a culture of SSE in schools. According to one principal, ‘when you consider that Literacy, Numeracy, SSE were all rolled out at the same time with very few resources provided it caused a lot of stress, frustration and annoyance’. Another principal also stated, ‘I believe that SSE is very worthwhile, and the SIP is focussed on the needs of the individual school. However due to lack of resources, time, available personnel, most principals and their key staff are spending long hours after school on this work to the point of exhaustion’.

Capacity

Table 4.2 illustrates that 55% of principals believe that their staff have the necessary skills required to conduct SSE. However, although 70% of principals believe that staff at their school have the capacity to analyse quantitative data, and almost 60% believe that their staff have the capacity to analyse qualitative data, a significant majority of principals are also of the view that principals, deputy principals and teachers require more training to carry out SSE. Only a minority of principals are of the opinion that staff at their school have the necessary training needed to conduct peer reviews.

Table 4.2 Descriptive statistics: SSE capacity

Analysis of quantitative data suggests that the majority of principals are of the view that staff at their school can analyse quantitative data and that the results from externally devised tests should be used as part of the SSE process (Table 4.3).

Table 4.3 Descriptive statistics: the use of external data for SSE

On the other hand, it is striking that there appears to be an underlying need for training in various aspects of assessment, in particular for the purposes of target setting and questionnaire development. According to one principal, ‘most schools have someone on their staff who can plug in the numbers and press the button. I think you can do this quite easily but to do it properly you need quite a bit of expertise’. As one principal states, ‘we are better at analysing data rather than devising questionnaires’. Indeed, according to another principal, ‘more training is needed for the full staff on how to find relevant information and set realistic targets. Reviewing targets is very time-consuming. The process needs to be constantly developed in schools with workshops etc. available to all staff’.

The ability of staff to use qualitative data for SSE was another area of training that required attention. As one principal put it, ‘training needs to be provided to ensure that staff feel confident to do this’. However, another principal questioned the practicalities and trustworthiness of, for example, the use of focus group data: ‘Focus group! More nonsense. When would I get time for focus groups? Real world. Give me the staff and we will have focus groups. Anyway, picking the right focus group will get the result you want. What other area of politics or public service listens to focus groups?’ This perspective resonates with another principal, who stated, ‘The Inspectorate appear not to be in favour of Qualitative Data as per a recent advisory visit’.

The use of peer review for SSE also appears to be at an embryonic phase. A small minority of principals are of the view that staff at their school have the necessary skills required to adequately conduct peer review (Table 4.2). As stated by one principal, ‘I would like more support in this regard and a whole school approach and a systemic approach so that all staff are getting the same input’. Indeed, in schools where peer review does take place, another principal stated, ‘I have done it, but I know it would be a more effective exercise with training’.

It is not surprising, therefore, that only a small minority of schools use peer review as part of their SSE process (Table 4.4), which is in line with DES (2015b) figures suggesting that only 28% of post-primary schools use team teaching or peer review as a source of evidence for SSE.

Table 4.4 Descriptive statistics: the use of peer review for SSE

Apart from lack of training in this area, other reasons were given as to why peer review is not used as part of the SSE process. Peer review is perceived by many teachers as a form of internal accountability rather than as a developmental element of the SSE process. As one principal bluntly put it, ‘there would be a revolution I fear’. Indeed, the following principal statement in many ways provides a summary analysis of the factors relating to the infrequent use of peer review in the SSE process: ‘more training needs to be provided. This is an area of contention with some teachers feeling under threat by peer review by a Principal or Deputy Principal. A system by which teachers review each other might be more acceptable and productive with teachers learning best practice from their colleagues’. As one principal whose school has engaged in the process states, ‘I have looked at models used abroad and tailored them for our own school and ethos. This exercise must begin gently with a bit of carrot to get it over the line’.

Process

This section presents the process phase of the evaluation cycle. The first subsection describes principals’ perceptions of SSE standards. The second subsection ascertains participants’ perspectives on the extent to which SSE processes and guidelines are understood.

Standards

Table 4.5 illustrates that almost 60% of principals are of the view that schools should use the same methods and procedures to carry out SSE, and 63% of principals are of the opinion that schools should use the same SSE processes. However, analysis of the qualitative data also reveals that many principals are hesitant about schools using the same SSE methods and processes.

Table 4.5 Descriptive statistics: SSE standards

Almost all principals that were interviewed were of the opinion that schools should decide what methods and procedures to use for SSE. Indeed, while principals believe that there should be a minimum quality threshold for SSE, the majority are also of the view that, when frameworks of quality indicators are provided externally, they should also be adaptable to the context and culture of the school. A common view in reference to the DES guidelines for SSE (DES 2012a, b, c, d) was that ‘the aim of DES policies and initiatives must reflect the needs of individual schools, particularly in relation to the school context. Appropriate self-evaluation should, therefore, be employed to meet individual schools needs and there should be flexibility for schools in relation to the methods and procedures used in order to allow for the diversity of school types/sizes/settings/staff profile/pupil profile, where the pupils are from, etc.’

Finally, Table 4.6 illustrates that 73% of schools have a set of procedures for carrying out SSE and almost 30% of schools have an SSE policy.

Table 4.6 Descriptive statistics: SSE policies and procedures

On the one hand, some principals stated that they were in the early stages of developing an SSE policy (‘we followed the SSE guidelines to date. Now we have ownership of these and will create our own school policy and procedures’). On the other hand, reasons were given by principals as to why their schools do not have an SSE policy at all. Some principals were of the view that it was too early in the SSE cycle to formulate an official SSE policy; according to one principal, ‘being a relatively new process, and given that getting to the stage we are at now has taken considerable time, there hasn’t yet been an opportunity to develop a formal school policy on the matter’. Other principals questioned the value of school policies more generally. According to one principal, ‘I am not sure we want to have to write and develop yet another policy’. Another principal agrees: ‘policies constrain creativity and inventiveness and are used too much to tick boxes. Policies are only useful if they are being used and agreed by all stakeholders’.

Although very few schools in Ireland have SSE policies, what is striking, however, is the number of schools that now have a systematic set of procedures to carry out SSE. Prior to the introduction of the DES SSE guidelines, a national survey by Brown (2013) found that only 26% of schools in Ireland had a set of SSE procedures. This figure has since risen dramatically to 73% in a short period. Indeed, almost all principals that were interviewed stated that they now used the SSE procedures contained in the DES guidelines, albeit with varying degrees of success. As stated by one participant, ‘we have followed the SSE guidelines’. Another principal also states that the procedures used in his school are ‘advised through Department documentation and in-service training’.

Accessibility

Table 4.7 shows that almost 53% of principals are of the view that the process of SSE is easy to understand. Moreover, almost 46% of principals believe that the SSE guidelines developed by the inspectorate are easy to understand.

Table 4.7 Descriptive statistics: accessibility

Regarding clarity of SSE processes, one principal states, ‘it is a document that is a snapshot of our school at a particular time and so is easy to follow and to understand’. Another principal suggests, ‘it’s not rocket science!’ Indeed, having engaged in the process since the introduction of SSE in 2012, many other principals are of the view that SSE processes have become more coherent. One principal states: ‘any type of change is never easy to understand at first. As the time has gone on and we have had more experience of this whole process our understanding has improved’. Another principal is also of the view that, ‘as we move along the process it is clearer and easier to understand. I suppose it is becoming an integral part of how we operate’.

On the other hand, a considerable minority of principals have struggled with the SSE process where according to one principal, ‘it is over complicated and difficult to understand’. Another principal concurs, ‘I feel that the Six Step model being used by the DES is overly complicated, and the same results could be achieved through action research’.

Principals’ perceptions of the SSE guidelines ranged from the lack of time available to engage with guidelines (‘the guidelines are fairly easy to understand but too long. Time is of the essence in an extremely busy environment’) to the fact that the guidelines were difficult to comprehend in the initial stages of SSE. As one principal puts it:

The 88-page book was (is) too long… It took us a long time to pare it down to its essential components and identify in a simple format the process as it should emerge from beginning to end (i.e. ‘developing a school self-evaluation report and working towards a school improvement plan’).

On the other hand, having engaged in the process and spent time reading the guidelines, many participants are of the view that the SSE guidelines are a valuable asset in embedding a culture of evaluation in their schools. For example, one principal states, ‘the Guidelines take getting used to. After leafing through it regularly, the sections and the language become familiar. It is logical and systematic in how it’s laid out but like anything, I guess, it takes time to figure out how it works’. Another respondent summarised it thus:

In my experience, I felt I did not have a good understanding of SSE until I set aside a significant amount of time to read, reflect on and discuss the guidelines with staff. This involved reading the DES book, using the website, meeting with staff regularly to discuss and view other schools work.

Output

This section presents findings relating to the output phase of the evaluation cycle. The first subsection describes the extent to which various members of the school community are engaged in the SSE process. The second subsection describes participants’ perspectives on the public availability of SSE reports.

Participation

Table 4.8 illustrates that, in the view of principals in Ireland, the majority of principals, deputy principals and teachers conduct SSE on a regular basis in their schools and that SSE involves all staff.

Table 4.8 Descriptive statistics: frequency of SSE

According to one principal, ‘we have been doing SSE in different ways for many years. The new structure makes it better’. On the other hand, although quantitative data suggests that schools in Ireland carry out SSE on a regular basis, it is apparent that the practice of self-evaluation in the majority of cases is through informal individual evaluations, ‘teachers do it every day but not with all the gathering of evidence, and paperwork involved in the Department process’. One critic added, ‘we mainly do it because we have to, not because we currently see that it is very effective in improving teaching, learning and management in our school’.

Moreover, it also appears that in some cases SSE seems to be a once-off annual event as opposed to a continuous, systematic whole-school process. One principal describes the regularity of SSE in the school as follows: ‘in our heads, we are evaluating every day. On official documents—once a year’.

Apart from the issue of time, the most significant challenge for SSE is to encourage staff to evaluate from a whole-school as opposed to an individual perspective. ‘Teachers always aim to improve their practice and regularly reorganise and change in order to meet the needs of the varying pupils they meet. However, the challenge is to get teachers to think from a whole school perspective. This is much more challenging and difficult for teachers’.

Transparency

Regarding the transparency aspect of SSE, Table 4.9 shows that a minority of principals (37%) agree or strongly agree that SSE reports should be published on the Internet. This figure is higher than DES (2015a, b) figures, which suggest that 22% of schools published the SSE report on their website. Nonetheless, it is in line with DES (2015a, b) figures which also suggest that, at post-primary level, 37% of schools provided a summary SSE report and 39% of schools provided an SSE improvement plan to their community.

Table 4.9 Descriptive statistics: transparency

Some principals stated that they had already begun to publish SSE reports: ‘we make them available to parents, but I often wonder who reads them’. Another principal also states: ‘ours is there. I am not sure who reads them!’ In contrast, one principal suggests that the publication of SSE reports has the potential to highlight successes and move away from league tables: ‘this is great. An opportunity to highlight successes and to show a plan in progress. We have to move away from state examination based validation only. It has a place, but we have to become confident in our own choices, explain them, show our method of self-examination and show how we will move forward’.

On the other hand, the vast majority of principals were not in favour of SSE reports being publicly available. Reasons given for not making SSE plans or reports available were many and ranged widely:

  • Having a disproportionate effect on non-selective schools. According to one principal: ‘This will suit schools that cherry pick students and whose results reflect that’.

  • The trustworthiness of reports, which could be altered to mask areas for improvement in the school: ‘Each school would be pressurised to put up outstanding work rather than honest work to impress future students’.

  • The belief that SSE reports should be internal to the school. As stated by one principal: ‘SSE should be just that, for internal use and reflection only. DES inspections are and should be published’.

  • The value placed on inspection reports in comparison to SSE reports. According to one principal: ‘parents value external evaluation, self-evaluation is discredited by them and lacks value. Self-evaluation is a process we all undergo in every walk of life, but it is an assessment that is subjective and open to abuse’.

It appears, therefore, that the requirement for SSE reports to be publicly available is seen as a negative step in the SSE process, a finding in many ways in line with Perryman (2009), who highlights the dilemma facing most evaluation systems that require schools to publish SSE reports for accountability purposes:

The problem with self-evaluation documents produced for evaluation is that an honest warts-and-all approach is simply not possible. Over-emphasise strengths, and a school could be criticised for complacency with a management team unable to plan for progress, but identify too many weaknesses, and there is a risk of giving a skewed picture which may influence the judgement of the inspectors negatively. (2009, p. 621)

Outcomes

This section presents findings related to the perceived impact of SSE on management, teaching and learning. Table 4.10 shows that a significant majority of principals are of the view that SSE results in better teaching, learning and management.

Table 4.10 Descriptive statistics: outcome of SSE

Although many principals had positive dispositions towards the SSE process, some concurred with the DES (2015a, b) view that ‘SSE is currently viewed as a chore—something that has to be done’. In particular, some principals were of the opinion that, while recognising the benefits of SSE, the process was too formulaic. As stated by one principal: ‘It results in paperwork, lots of time and box-ticking. The theory is great, but actually, this is taking from the spontaneous good work being done in good schools and restricting us to a formula for ever-increasing obligations. The result: Much box-ticking’. Another principal is of the opinion that, ‘Self-evaluation definitely leads to better everything but creating a further paper trail just adds more to an already burgeoning workload and instead of being a welcome addition to school life it becomes another chore that has to be done to satisfy somebody else’.

On the other hand, some principals were also of the view that SSE results in better management, teaching and learning. One principal is of the opinion that the requirements of SSE have led to staff members becoming more reflective about their practice: ‘it has helped in reflective practice in a busy school environment in looking at what’s working well in our school and what needs to improve or develop further’.

Similar statements were also used to describe how SSE has improved certain quality aspects of education in interview participants’ schools: ‘analysis of exam results is open and transparent, and now it is not regarded as personal criticism of teachers if areas for improvement are highlighted’. Another noted, ‘this current phase works as we have seen improvements in Literacy and Numeracy as a consequence of SSE implementation’. In a similar vein it was suggested that, ‘there is a lot more discussion on teaching and learning techniques at all subject levels’ and again, ‘the focus on numeracy and literacy has prompted teachers to share good practice and to reflect on their own methodologies’.

Based on these statements, it appears that as a result of the SSE process there has been an increase in collaboration, reflective practice and evaluative dialogue. As one principal states, ‘If teachers work through the process and own the SIP it will inevitably lead to better teaching and improved outcomes for pupils’.

Given that SSE has only been a mandatory requirement since 2012 (DES 2012a, b, c, d), some principals were of the view that it is too early to say whether the SSE process has resulted in improved outcomes. As stated by one principal, ‘again I think that it is too early to say just yet’. Another principal agrees, ‘it is too early to say whether this process results in better teaching and learning but by using strong evidence in our work is a support to staff in reviewing areas of teaching and learning’.

Unintended Consequences

Table 4.11 illustrates that a majority of principals are of the view that there are no unintended consequences as a result of their schools engaging in SSE. Table 4.12, however, reveals that there were both positive and negative indirect effects as a result of SSE engagement.

Table 4.11 Descriptive statistics: unintended consequences of SSE
Table 4.12 Descriptive statistics: indirect effects of SSE

Analysis of the qualitative data reveals that a number of principals are of the view that there has been a considerable increase in stress as a result of their schools’ engagement with SSE, for a variety of reasons: the increased workload required by SSE; the feeling that mandatory SSE was untimely; the lack of available resources for SSE; and the perceived need to formalise all SSE activities.

For most principals, the new SSE guidelines have brought a considerable increase in workload for schools: ‘from a principal’s point of view it takes up a huge amount of time, planning for self-evaluation, researching, drawing up plans, getting consensus to implement them and follow up’. Another principal also states, ‘it is an extra workload. I do not know if it would stand up well to a time-benefit analysis’.

Many principals also feel that the introduction of mandatory SSE was untimely, given the decimation of school resources and the parallel introduction of other initiatives, resulting in what one principal refers to as ‘initiative overload’. As noted by one principal:

Pressures from recently introduced and then changing initiatives, reduction of middle management positions and reduced salaries are resulting in a teaching cadre that is much more cautious about tackling further change. Most worryingly I would include my motivators in this category, despite wonderful work continuing in this area.

Principals were also of the view that there were many positive unintended consequences resulting from their schools’ engagement with SSE, such as:

  • An increase in professional dialogue among staff. According to one principal, ‘SSE has resulted in a more professional dialogue between staff’. Another principal similarly suggests, ‘there has actually been some time spent at staff meetings talking about teaching and learning rather than about the usual “discipline” issues’.

  • Increased collaboration across sectors. As stated by one principal, ‘SSE has resulted in increased collaboration between Primary and Secondary Schools during the Transfer process with perhaps a greater awareness among Second Level Staff of testing at Primary Level’.

  • An increased understanding of what is required for inspection. According to one principal, ‘if you read the quality statements in the SSE guidelines you can be pretty sure what they are looking for during an inspection. Before we were doing a lot of guess work on what they were looking for’. Another principal added, ‘I can now support my staff more easily by telling them what to focus on for the inspection’.

  • A greater sense of collegiality among staff. According to one principal, ‘there is far more collaboration between staff and that no one will do it for us’. Another principal concurs, ‘there are now more subject discussions, more confidence in teachers’ self-belief and less isolation of a teacher in a classroom with no backup’.

Many principals were also of the view that SSE engagement had indirectly resulted in leadership being less hierarchical in their schools. One principal states, ‘I think it has brought us together as a staff. It has also allowed leaders emerge from the staff group based on ability rather than age. It is really interesting to see young teachers taking a lead and helping older teachers engage with ideas’. Another principal puts it this way, ‘funnily enough it has allowed individual staff to shine in ways that they have not up to now. Those with Masters and those who know about research have come to the fore’.

Finally, principals were also of the view that another indirect effect of SSE engagement has been a significant increase in distributed leadership. According to one principal, ‘there is great staff engagement now. Subcommittees have started up. Distribution of leadership is now very evident’. Another principal also states, ‘now that staff has bought into the process it has helped to forge a strong team effort in the school’. Indeed, having engaged in the process, one principal summarises:

At the initial stages, it probably caused a bit of unease as we were trying to come to terms with it. What it has helped is in getting the whole staff to work together. Ownership has been appreciated by the teachers involved. What we need to do now is to take it step by step and work at our own pace.

Discussion

In 2012 the evaluation of schools in Ireland underwent a significant change. SSE moved to the centre of the process, and external inspection, which had been the dominant mode until that time, was recast as being largely about inspectors assuring the quality of schools’ self-evaluation activities. A detailed framework for SSE was promulgated, requiring schools to make use of a wide range of quantitative and qualitative data to develop improvement plans including specific targets. This greater emphasis on SSE was in alignment with developing policy in several other countries.

For the vast majority of schools this was a new and demanding requirement, involving data collection and analysis, planning and target setting of a more specific and detailed type than before. Previous research had indicated that few schools had the resources or expertise to undertake this type of work, and there was some trepidation about the new departure in inspection policy. The Inspectorate did provide a degree of training and support for schools commencing SSE in the format now required.

The research reported in this chapter represents the first large-scale attempt to investigate how SSE has worked out in practice in schools since 2012. The research involved a survey of all principals of second-level schools in Ireland, followed by interviews with 24 of them.

The outcome of this research is by and large positive and encouraging. Although, as might be expected, most schools report the need for greater resources and support, the majority believe that the SSE framework and the training and support provided by the Inspectorate have been helpful. A surprisingly high proportion of school leaders believe that staff have the necessary skills to conduct evaluation, although a number of areas, in particular peer review of teaching and target setting in areas such as literacy and numeracy, require additional development. The framework and guidelines produced by the Inspectorate are widely used, and around three quarters of the schools surveyed report engaging in active SSE.

A minority of principals did raise some negatives, including the overly prescriptive nature of the framework, which leaves individual schools with little freedom to pursue their differing needs. A more troubling criticism made by some respondents was that other stakeholders, particularly parents, tended to regard SSE as unlikely to be critical and suggested that external inspection reports would be taken more seriously. Some respondents also suggested that the process tended to be perceived as an annual event to be completed rather than an ongoing aspect of school life.

These criticisms were relatively minor in comparison to the positive outcomes pointed to by informants in this research. While some principals suggested that it was too early to evaluate the effectiveness or otherwise of SSE, others listed a significant array of progress and improvement resulting from the process. Among these were the focus on specific areas such as literacy and numeracy leading to clear targets and measurable improvements in achievement, more open discussion around the outcomes of public examinations and what these results say about the performance of schools, more collaborative work among teachers, and signs of genuine distributed leadership in schools.

However, there was a common view that these gains were at the expense of a considerable increase in both stress and workload, together with a degree of initiative fatigue, which led some respondents to argue that while SSE seems to be worthwhile it was difficult to envisage that schools could continue to put so much effort into it without specific resources being made available.

Finally, the impact of these changes on the role and functioning of the Inspectorate has been considerable. SSE before 2012 had largely been a rhetorical requirement, and the focus of school evaluation was on the traditional inspection of individual teachers and the newer concept of whole-school evaluation involving teams of inspectors. However, it had become clear, even before the economic collapse, that this form of inspection would result, due to limited resources, in unacceptably long gaps between inspection visits. Moreover, it was emerging internationally that most inspectorates were moving towards more focused, risk-based forms of inspection and away from cyclical visitations. Part of this process, internationally, was the growing emphasis on SSE, transferring responsibility for improvement in standards and monitoring of performance away from centralised inspection and onto the schools. The shift in policy in Ireland in 2012 mirrored these trends.

In effect, therefore, the role of the Inspectorate in this new era becomes more indirect. The task now is to set parameters, standards and methodologies for schools to take responsibility for quality control, while the Inspectorate quality assures the resulting processes. However, it is probably still too early to say whether, in the longer term, schools can find the capacity and willingness to self-evaluate in a systematic and robust manner and whether, even if they do so, this approach can deliver a satisfactory level of accountability and the public confidence that external inspection has managed to attain.