

Institutionalised ICT use in primary education: A multilevel analysis

Ruben Vanderlinde a,b,*, Koen Aesaert a, Johan van Braak a

a Department of Educational Studies, Ghent University, Henri Dunantlaan 2, B9000 Ghent, Belgium
b Research Foundation Flanders, Belgium

Article info

Article history: Received 11 June 2013; Received in revised form 8 October 2013; Accepted 10 October 2013

Keywords: ICT use; Primary education; Multilevel analysis

* Corresponding author. Department of Educational Studies, Ghent University, Henri Dunantlaan 2, B9000 Ghent, Belgium. Tel.: +32 9 264 86 30; fax: +32 9 264 86 88. E-mail address: [email protected] (R. Vanderlinde).

0360-1315/$ – see front matter © 2013 Elsevier Ltd. All rights reserved. http://dx.doi.org/10.1016/j.compedu.2013.10.007

Abstract

This study uses a multilayered framework of different independent school and teacher variables to study which factors are related to the use of ICT for teaching and learning in Flemish (Belgium) primary schools. Special attention is paid to widely accepted technology uses by teachers, which is labelled as 'Institutionalised ICT use'. A questionnaire has been administered to a representative teacher sample (N = 433) in 53 Flemish primary schools. Factor analyses and multilevel hierarchical regression analyses have been conducted. The results of the multilevel analysis show that 'Institutionalised ICT use' should not only be considered as a teacher phenomenon but also as a school phenomenon. The null model shows that about 14% of the variance in ICT use of teachers is due to between-school differences. In a final model, the variables 'ICT professional development', 'ICT competences', 'developmental educational beliefs', and 'schools' ICT vision and policy' showed a positive association with 'Institutionalised ICT use'.

© 2013 Elsevier Ltd. All rights reserved.

1. Introduction

Within the context of the knowledge society, teachers and schools try to make use of ICT in their practices to improve students' 'twenty-first century skills' (Anderson, 2008). ICT integration for teaching and learning is becoming a major task for primary schools all around the world (Vanderlinde, van Braak, & Hermans, 2009). In this context, ICT is perceived as a tool to help students master the skills required for using information and communication systems (Anderson, 2008), to foster self-regulated learning strategies (Karabenick, 2011), to change the interaction within the classroom and to involve people outside the classroom in pupils' learning activities (Harris, in Anderson, 2002). ICT can support the use of a constructivist approach to teaching and learning (Niederhauser & Stoddart, 2001). ICT and technology are seen as a golden key in facilitating technology-enhanced and student-centred learning environments (Hannafin & Land, 1997). Given these optimistic voices and given the positive impact that ICT can have on student learning (e.g. BECTA, 2007), researchers are searching for factors, situated on different levels (e.g. student, teacher, school, and policy), that best support the use of ICT for teaching and learning (Cox, 2008). This is one of the key questions of the ICT integration research tradition (e.g. Kozma, 2003a).

1.1. ICT use

ICT use is at the heart of several studies in the ICT integration literature (e.g. Afshari, Bakar, SuLuan, Samah, & Say, 2009; Meneses, Fàbregues, Rodríguez-Gómes, & Ion, 2012; Tondeur, van Braak, & Valcke, 2007). ICT integration means that ICT is used in education to foster teaching and learning processes. Its transformative nature (Watson, 2006) is stressed, referring to the use of ICT as a lever for instructional change (Vanderlinde & van Braak, 2010). In the ICT integration literature, ICT use has been measured and operationalised in many different ways, both quantitatively and qualitatively (e.g. de Koster, Kuiper, & Volman, 2012). The qualitative paradigm has been very popular in international comparative research designs on the use of ICT in classrooms and the curriculum; see for instance the SITES M2 case studies on this topic reported in Kozma (2003a) and Voogt and Pelgrum (2005). This study fits within a quantitative research paradigm on ICT integration. Within this paradigm some researchers report the amount of ICT use in the classroom, the time students spend working

with ICT, or the use of specific computer applications. For instance, Smeets (2005) measured both the frequency of ICT use during classes and the frequency of use of specified types of ICT applications (e.g. word processing, e-mail, etc.). Baylor and Ritchie (2002) state that many researchers operationalise ICT use in terms of a basic dichotomy whereby ICT is either used as the subject of study or as an instructional tool to teach other content. Niederhauser and Stoddart (2001) focus on the use of educational software, making a distinction between 'skill-based transmission software' and 'open-ended constructivist software'. Nowadays, researchers do not consider ICT use as a monolithic process, but emphasise that ICT can be integrated in many different ways in classrooms. In this sense ICT use is studied as a more complex phenomenon referring to different types of use. For instance, Tondeur et al. (2007) make a distinction between three types of computer use: (1) the use of ICT as an information tool, (2) the use of ICT as a learning tool, and (3) learning basic computer skills. Baylor and Ritchie (2002) describe nine different types of ICT use, including "the use of ICT for subject-matter content", "the use of ICT for collaboration", and "the use of ICT for higher order skills". Meneses et al. (2012) speak about professional use, making a distinction between 'supportive' use and 'management' use. The first is linked to classroom preparation activities like finding supplementary information for lessons. Management use is related to teachers' general duties in the functioning of schools as organisations (e.g. performing administrative tasks, communicating with colleagues, interaction with parents and students). Although the research community has developed many different instruments to measure ICT use, these studies furthermore report that the actual use of ICT for teaching and learning is rather limited (e.g. Niederhauser & Stoddart, 2001; Smeets, 2005; Tondeur et al., 2007), and these results compel the search for factors and conditions that explain under which circumstances ICT can fulfil its transformative nature. Scholars researching ICT integration from a qualitative perspective make similar conceptualisations of ICT integration by also stressing its complex and multi-faceted nature. de Koster et al. (2012), for instance, make a distinction between (1) ICT tools available to and used by teachers and/or pupils, (2) goals with which these tools were used, and (3) activities performed with these tools.

Fig. 1. Research model.

The study reported in this article takes another viewpoint by only paying attention to widely accepted uses of ICT by teachers. This means that, in contrast to 'traditional' quantitative research designs in the ICT integration literature, this study has a distinctive focus towards full implementation and integration of ICT in classrooms. This viewpoint will be further described as 'Institutionalised ICT use'. The approach is unique as it could be characterised as a reversed way of working. In 'traditional' designs, researchers focus on all kinds of ICT use (from low to moderate and intensive use), and then look for factors associated with these different forms of ICT use. The approach presented in this article solely considers those types of ICT use that are commonly adopted in classrooms by teachers, and gives insight into which factors are associated with ICT use that is highly accepted and integrated by teachers into their classrooms. Identification of these factors is essential, since their manipulation can take less integrated kinds of ICT use for teaching and learning to a higher frequency of use in the classroom.

2. Research objective

The general research objective of this study is to identify which factors are associated with 'Institutionalised ICT use', or ICT use that has been fully implemented for teaching and learning. This study investigates how several general and ICT related school and teacher factors are associated with the use of ICT in Flemish (Belgium) primary education. These are the independent variables of the study, and are selected from the e-capacity framework (see Fig. 1) of Vanderlinde, van Braak, and Tondeur (2010). Preceding this objective, a valid and reliable instrument that measures the 'institutionalised' ICT use of teachers has been composed. This developed scale serves as the dependent variable of the study.

3. Methodology

3.1. Research context

The study is situated in Flanders, the Dutch-speaking part of Belgium. In 2007, the Flemish Government introduced a compulsory ICT curriculum in primary schools. The ICT curriculum is written in terms of ICT attainment targets or minimum objectives, which describe the ICT knowledge, skills, and attitudes viewed by the government as necessary for and attainable by most students in compulsory education. The ICT attainment targets do not focus on technical skills, but emphasise the integrated use of ICT within the teaching and learning process (Aesaert, Vanderlinde, Tondeur, & van Braak, 2013; Vanderlinde et al., 2009). The Flemish Government expects schools to implement the ICT curriculum into practice and to translate the broadly formulated ICT attainment targets into concrete teaching and learning activities. ICT use in Flanders is thus becoming more prominent on the educational agenda of policy makers, and in the practices of teachers.

3.2. Participants

Data were drawn from a sample of 433 school teachers in 53 primary schools in Flanders. The participants are evenly distributed across grades 1–6 in the 53 primary schools. The sample was 85% female; ages ranged from 22 to 61 years, with an average age of 39. On average, teachers reported that they had used a computer for approximately 13 years at home and 9 years in the classroom.

3.3. Dependent variable

This study has a unique perspective in measuring ICT use for teaching and learning by focussing on institutionalised or widely accepted ICT use. This means that, contrary to other studies, this study solely pays attention to ways of ICT use that are frequently used by teachers in their classroom (see above). To do this, a new dependent variable 'Institutionalised ICT use' was constructed for this study (see Descriptive statistics and scale reliability), based upon four existing and widely used scales. Put differently, this new variable is based on existing scales measuring teachers' actual use of ICT in their classroom practice (Vanderlinde & van Braak, 2010), distinguishing four types of ICT use (Hermans, 2009; Tondeur et al., 2007):

- The use of 'basic ICT skills' (B-ICT, 4 items).
- 'ICT as a learning tool', referring to the use of ICT to support pupils' learning (LEAR, 5 items).
- 'ICT as an information tool', referring to the use of ICT to select, retrieve, and present information (INF, 7 items).
- 'Innovative ICT use' (INNO, 18 items), assessing primary teachers' innovative use of ICT for educational purposes.

Items of the teachers' actual use of ICT scales have a frequency Likert-scale answer format (i.e., 0 = never, 1 = every term, 2 = monthly, 3 = weekly, and 4 = daily).

3.4. Independent variables

The selection of independent variables is based on a revision of the e-capacity framework of Vanderlinde and van Braak (2010). This framework was developed from a school improvement perspective, and consists of factors and conditions fostering the integration of ICT into teaching and learning practices. Central to this framework is the e-capacity of a school, which refers to the school's ability to create and optimise school- and teacher-level factors and conditions to bring about effective ICT change. These factors have been translated into reliable and valid measurement scales (see Vanderlinde & van Braak, 2010), and are clustered into four mediating subsets of variables: ICT related teacher conditions, general teacher conditions, ICT related school conditions, and general school improvement conditions. The subsets of variables illustrate the multilayered nature of conditions affecting ICT integration (see Fig. 1). They are all embedded within the broader general and ICT related context of the educational policy system (see for instance Yuen, Law, Lee, & Lee, 2010), as we know that ICT classroom practices are influenced by state or national policies in areas such as curriculum and assessment, professional development, and telecommunications (Kozma, 2003b).

The first layer of variables refers to ICT related teacher conditions. In the e-capacity framework, two endogenous conditions are put forward: the relevance of ICT knowledge and skills, and ways of acquiring them (see also Granger, Morbey, Lotherington, Owston, & Wideman, 2002). More concretely, Vanderlinde and van Braak (2010) present two measurement scales:

- The 'teachers' ICT professional development' (4 items) scale assesses the extent to which teachers keep up with developments in the field of ICT integration, like taking part in in-service teacher training programmes.


- The 'teachers' ICT competences' (5 items) scale measures the degree to which teachers find themselves competent in integrating ICT into their classroom practice.

The second layer of conditions contains general teacher conditions. These are conditions described in the school improvement literature as conditions fostering implementation. The last two measurement scales refer to teachers' educational beliefs on the nature of good education. Research shows that educational beliefs have an impact on implementing educational innovations in general (Van Driel, Bulte, & Verloop, 2007), and ICT in particular (Hermans, Tondeur, van Braak, & Valcke, 2008).

- 'Teacher efficacy' (12 items) refers to the 'teacher's judgement of his or her capabilities to bring about desired outcomes of student engagement and learning' (Tschannen-Moran & Woolfolk Hoy, 2001). The teacher efficacy scale is the short version of the scale developed by Tschannen-Moran and Woolfolk Hoy (2001).

- Innovativeness is defined as a personal attitude towards the adoption of an innovation; the 'innovativeness' (5 items) scale of Van Braak (2001) measures teachers' willingness to change.

- The 'developmental beliefs' (9 items) scale determines to what degree education should be oriented towards broad and individual development, be process oriented with an open curriculum, and to what degree knowledge should be acquired through acquisition (Hermans, van Braak, & Van Keer, 2008).

- The 'transmissive beliefs' (9 items) scale assesses the extent to which respondents believe education serves external goals and is outcome oriented with a closed curriculum. It also evaluates to which extent knowledge acquisition is perceived as being most adequately achieved through transmission (Hermans, van Braak et al., 2008).

The third layer of variables refers to ICT related school factors. This includes a range of organisational features or local conditions that affect ICT integration. Vanderlinde and van Braak (2010) constructed three scales measuring these conditions:

- The 'schools' ICT vision and policy' (7 items) scale assesses (a) the extent to which a school has a clear vision on the place of ICT in education, and (b) the extent to which a school has a policy and policy plan containing different elements concerning the integration of ICT in education.

- The 'ICT infrastructure' (4 items) scale assesses, from a teacher perspective, the availability, satisfaction, and appropriateness of the ICT school and classroom equipment (i.e., hardware, software, and peripheral equipment).

- The 'ICT school support and coordination' (7 items) scale assesses the degree to which ICT integration is coordinated at the school level and the extent to which ICT support is arranged within the school.

The fourth layer of variables refers to conditions described in the school improvement literature as contributing to the implementation and realisation of educational change. Vanderlinde and van Braak (2010) include four of these school improvement conditions in their e-capacity framework:

- The leadership scales of Hoy and Tarter (1995, 1997) contain the 'supportive leadership' (5 items) and 'initiating structure' (7 items) scales. The first scale measures efforts to motivate teachers by using constructive criticism and setting an example through hard work. At the same time, the school leader is helpful and genuinely concerned with the personal and professional welfare of teachers. The second scale is related to task and achievement oriented leadership behaviour. The school leader makes his or her attitudes and expectations clear and maintains definite standards of performance (Hoy & Tarter, 1995, 1997).

- The 'professional relations among teachers' (7 items) scale measures the level of communication and cooperation between teachers (Staessens, 1990).

- The 'participation in decision making' (5 items) scale of Geijsels et al. (2001, 2009) measures the extent to which teachers believe that they participate in the processes and outcomes of the school's decision making around issues of education, innovation, and school improvement.

All items of the independent variables have a Likert-scale answer format ranging from 0 (totally disagree) to 4 (totally agree). The items are presented in Vanderlinde and van Braak (2010).

3.5. Analysis

The 433 teachers (level 1) of this study are nested in 53 schools (level 2). In order to take the corresponding hierarchical structure of nested variables into account, multilevel analysis (MLwiN 2.25) was used to examine the effects of demographic teacher variables, ICT related and general teacher variables, and ICT related and general school characteristics on teachers' Institutionalised ICT use on different levels (see Fig. 1). A multilevel technique based on hierarchical regression was used to investigate the data because of its nested structure (teachers in schools).

The conceptual framework (Fig. 1) of this study guided seven main models that were tested. First, a fully unconditional null model was tested to investigate whether a multilevel approach was preferred over a single level linear regression. The following five models respectively added demographic teacher variables, ICT related teacher variables, general teacher variables, ICT related school conditions and general school conditions. Before the factors of a next subset of variables were added to the model, the non-significant factors were first deleted from the model. Using a stepwise multilevel approach made it possible to check the additional value of each subset of variables to the model in terms of the proportion of explained variance (Gorard, 2003; Hermans, Tondeur et al., 2008; Hermans, van Braak et al., 2008). Since the school conditions were measured through teacher data, aggregate measures expressed as means were calculated using MLwiN. In a final seventh model, complex variance at the teacher and school level was allowed for all the remaining variables.
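The models themselves were estimated in MLwiN 2.25. For readers who want to see the specification in code form, the sketch below shows how a comparable two-level random-intercepts model could be set up in Python with statsmodels; it is an illustration only, and the data file and variable names (teacher_survey.csv, inst_ict_use, ict_pd, ict_comp, dev_beliefs, ict_vision, school_id) are hypothetical rather than taken from the original dataset.

```python
# Illustrative sketch only: the study fitted its models in MLwiN 2.25.
# Teachers (level 1) are nested in schools (level 2); a random intercept
# per school captures the between-school variance.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("teacher_survey.csv")  # hypothetical data file

# Grand-mean centre the teacher-level predictors (as described for model 2).
for col in ["ict_pd", "ict_comp", "dev_beliefs"]:
    df[col + "_c"] = df[col] - df[col].mean()

# School-level predictor: aggregate teacher reports to school means.
df["ict_vision_school"] = df.groupby("school_id")["ict_vision"].transform("mean")
df["ict_vision_school_c"] = df["ict_vision_school"] - df["ict_vision_school"].mean()

# Two-level random-intercepts model with a random intercept for school.
model = smf.mixedlm(
    "inst_ict_use ~ ict_pd_c + ict_comp_c + dev_beliefs_c + ict_vision_school_c",
    data=df,
    groups=df["school_id"],
)
result = model.fit(reml=False)  # ML estimation so deviances can be compared
print(result.summary())
```

Comparing the deviance (−2 log likelihood) of successive fits of this kind corresponds to the stepwise model comparisons reported in Table 3.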


4. Results

4.1. Descriptive statistics and scale reliability

The dependent variable 'Institutionalised ICT use in primary education' was measured using 13 items. The selection of these items is based on an item analysis of the items of all four ICT use measurement scales (Basic ICT skills, ICT as a learning tool, ICT as an information tool, and Innovative ICT use). Of all these items, we selected only those items with an M > 1.20, considering these items as the most used by primary school teachers, further defined as items representing 'Institutionalised ICT use'. Of course, we have to be aware that the cut-off score of M > 1.20 is based on the results of the item analysis, as there was a clear gap in frequency between item 13 (M = 1.20) and the next item, item 14 (M = .94).

Further, we have to be aware that there is still a lot of room for improvement in the frequency of ICT use of the 13 selected items. Nevertheless, this study fulfils its ambition to have an exclusive focus on ICT activities for teaching and learning that are commonly and frequently used. The item analysis resulted in a collection of 13 items, which covered the main ICT activities that pupils do in primary classes. Table 1 presents the 13 items and their descriptive statistics. This table reveals that items from all four ICT use measurement scales (B-ICT, LEAR, INF, INNO) in the conceptual framework are represented in the selection of the 13 items. The overall scale score of this new measurement scale 'Institutionalised ICT use in primary education' was 45.88 (SD = 19.38), with a theoretical range from 0 to 100. The newly developed scale showed a good internal consistency (α = .87) and is therefore considered a reliable instrument (see Table 2). Exploratory factor analysis (scree plot and parallel analysis) confirmed the one-factor solution of this new variable, accounting for 39.7% of the common variance. Factor loadings range from .46 to .75.
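As a minimal sketch of the two steps just described (mean-based item selection and an internal-consistency check), the snippet below shows one way to compute them; the DataFrame of item scores and the use of a >= comparison against the reported threshold of 1.20 are illustrative assumptions, not the authors' exact procedure.

```python
# Minimal sketch: select frequently used items and check scale reliability.
# `items` is assumed to be a DataFrame with one column per ICT-use item,
# scored 0 (never) to 4 (daily), one row per teacher.
import pandas as pd

def select_frequent_items(items: pd.DataFrame, cutoff: float = 1.20) -> pd.DataFrame:
    """Keep items whose mean frequency is at or above the cut-off."""
    return items.loc[:, items.mean(axis=0) >= cutoff]

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical usage:
# selected = select_frequent_items(all_ict_items)
# print(selected.shape[1], "items retained; alpha =", round(cronbach_alpha(selected), 2))
```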

Table 2 gives an overview of the descriptives, reliability coefficients (Cronbach's alphas) and correlates of each of the used scales that were subsequently integrated in the multilevel model. The results show that the values of the correlations are acceptable. Moreover, the Cronbach's alphas (ranging from .73 to .92) indicate good reliability of all used measurements.

4.2. Multilevel model

4.2.1. Null model

During the first part of the analysis no level 1 (teacher) and level 2 (school) predictors were integrated in the two-level random intercepts model. Consequently, the intercept of this null model refers to the overall mean score (M = 45.88) on the 'Institutionalised ICT use' scale of all teachers in all schools. Assuming a normal distribution with a mean of 45.88 and a variance of 52.51, the 95% coverage bounds reveal that in the 2.5% of schools where teachers do most of the institutionalised ICT activities, the average score on the 'Institutionalised ICT use' scale is $45.88 + 1.96\sqrt{52.51} = 60.08$. In the 2.5% of schools where teachers do least of the ICT activities, the average score on the 'Institutionalised ICT use' scale is $45.88 - 1.96\sqrt{52.51} = 31.68$. Furthermore, the results support the use of ML-modelling for studying teachers' 'Institutionalised ICT use'. Both the between-school variance (school level; σ²u0 = 52.506, χ² = 7.771, df = 1, p < .01) and the within-school variance (teacher level; σ²e0 = 324.534, χ² = 191.417, df = 1, p < .001) differ significantly from zero. The calculation of the variance partitioning coefficient (VPC) delivers the proportion of the variance that is attributed to differences between schools. This is done by dividing the between-school variance by the total variance of 'Institutionalised ICT use' among teachers (52.51/(52.51 + 324.53) × 100). The results indicate that 13.93% of the variance is due to between-school differences, whereas 86.07% is attributed to differences at the teacher level. Hence, the two-level null model, in which the variance at the school level is taken into account, is a significant improvement over the single level model (χ² = 20.063, df = 1, p < .001).
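The coverage bounds and the variance partitioning coefficient reported above follow directly from the two estimated variance components; the short snippet below, using the rounded values from the null model, reproduces the arithmetic.

```python
# Arithmetic check of the null-model quantities reported above.
import math

between, within, grand_mean = 52.51, 324.53, 45.88

vpc = between / (between + within)               # variance partitioning coefficient
upper = grand_mean + 1.96 * math.sqrt(between)   # average score, top 2.5% of schools
lower = grand_mean - 1.96 * math.sqrt(between)   # average score, bottom 2.5% of schools

print(f"VPC   = {vpc:.4f}")   # ~0.1393, i.e. 13.93% between-school variance
print(f"upper = {upper:.2f}") # ~60.08
print(f"lower = {lower:.2f}") # ~31.68
```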

The school effect is illustrated by the rank order of the school level residuals (see Fig. 2). The schools are ranked by the degree to which they perform widely accepted or institutionalised ICT activities. The caterpillar plot shows that school 4 has the highest average score on the 'Institutionalised ICT use' scale (45.88 + 12.04 = 57.92), whereas school 52 has the lowest average score (45.88 − 10.69 = 35.19).

Table 1. Descriptives of the 'Institutionalised ICT use in primary education' scale.

Item | M | SD | Never (%) | Every term (%) | Monthly (%) | Weekly (%) | Daily (%) | Original scale
Pupils in my class learn the basic skills to use ICT | 2.48 | .97 | 4.6 | 11.1 | 25.9 | 48.7 | 9.7 | B-ICT
My pupils use educational software and instructional computer programmes to learn | 2.34 | 1.10 | 8.8 | 11.3 | 27.7 | 41.1 | 11.1 | LEAR
Pupils in my class learn to use the computer machine and other ICT peripheral equipment | 2.26 | 1.17 | 11.8 | 13.4 | 22.2 | 42.7 | 9.9 | B-ICT
My pupils learn to use ICT in a safe manner | 2.19 | 1.22 | 13.6 | 13.9 | 23.8 | 37.4 | 11.3 | B-ICT
My pupils use educational software and instructional computer programmes to make exercises | 2.10 | 1.16 | 11.5 | 18.2 | 27.7 | 33.5 | 9.0 | LEAR
Pupils in my class use ICT for remedial assignments | 2.05 | 1.18 | 14.8 | 14.8 | 28.4 | 34.6 | 7.4 | LEAR
In my class pupils use ICT to look up and select information (e.g. Google, Yahoo, etc.) | 1.79 | 1.26 | 22.4 | 17.1 | 26.6 | 27.0 | 6.9 | INF
My pupils learn about ICT because I use ICT during classical instruction | 1.68 | 1.48 | 31.6 | 18.7 | 16.6 | 15.9 | 17.1 | INF
My pupils use ICT to store information | 1.66 | 1.27 | 25.6 | 20.3 | 21.7 | 26.8 | 5.5 | INF
In my class, pupils with learning problems use appropriate educational software and instructional computer programmes | 1.61 | 1.36 | 32.8 | 14.5 | 19.6 | 25.4 | 7.6 | LEAR
In my class, pupils use digital databases (e.g. Wikipedia, GoogleEarth, GoogleBooks, etc.) to look up information | 1.33 | 1.21 | 34.4 | 21.0 | 25.4 | 15.2 | 3.9 | INNO
My pupils use ICT to save and to share files with each other | 1.24 | 1.32 | 44.3 | 15.9 | 16.4 | 18.2 | 5.1 | B-ICT
In my class, we make use of simulation software, whiteboards, beamers to exemplify and/or explain complex matters | 1.20 | 1.30 | 56.8 | 12.5 | 6.9 | 7.9 | 15.9 | INNO

Table 2. Reliability coefficients, descriptives and correlates.

Variable | N | α | M | SD | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16
1. Sex | – | – | – | – | 1
2. Age | – | – | 39.19 | 9.73 | .12* | 1
3. Participative decision making | 5 | .78 | 52.51 | 11.59 | −.02 | −.09 | 1
4. Professional relationships | 7 | .85 | 69.83 | 15.06 | −.12* | −.20** | .44** | 1
5. Structure initiating leadership | 5 | .85 | 76.54 | 15.48 | −.14** | −.08 | .42** | .27** | 1
6. Supportive leadership | 7 | .92 | 71.38 | 19.48 | −.09 | −.20** | .58** | .43** | .65** | 1
7. Schools' ICT vision and policy | 7 | .92 | 58.03 | 18.11 | −.03 | .10* | .29** | .20** | .42** | .20** | 1
8. ICT school support and coordination | 7 | .88 | 70.12 | 17.34 | −.07 | −.01 | .28** | .31** | .26** | .27** | .58** | 1
9. ICT infrastructure | 4 | .83 | 59.37 | 22.94 | −.01 | −.01 | .28** | .23** | .19** | .25** | .38** | .41** | 1
10. Teacher efficacy | 12 | .90 | 79.34 | 7.73 | −.00 | .08 | .10* | .13** | .15** | .07 | .23** | .17** | .19** | 1
11. Innovativeness | 5 | .87 | 27.91 | 17.55 | .19** | .25** | −.15** | −.21** | −.08 | −.14** | −.13** | −.30** | −.14** | −.18** | 1
12. Developmental educational beliefs | 9 | .65 | 70.14 | 9.63 | −.07 | −.11* | .22** | .32** | .17** | .22** | .15** | .20** | .11* | .17** | −.27** | 1
13. Traditional educational beliefs | 9 | .73 | 58.32 | 11.03 | −.05 | −.09 | .05 | .01 | .17** | .12* | .10* | .06 | .06 | .04 | .16** | .16** | 1
14. ICT professional development | 4 | .84 | 46.59 | 21.11 | .01 | .00 | .14** | .13** | .12* | .06 | .35** | .25** | .28** | .23** | −.24** | .16** | .11* | 1
15. ICT competences | 5 | .85 | 63.03 | 19.41 | .08 | −.32** | .23** | .18** | .12* | .15** | .26** | .29** | .29** | .16** | −.42** | .20** | .00 | .53** | 1
16. Institutionalised ICT use | 13 | .87 | 45.90 | 19.38 | .05 | −.12 | .22** | .16** | .13** | .16** | .31** | .19** | .39** | .21** | −.23** | .22** | .02 | .41** | .41** | 1

*Significant at the .05 level; **significant at the .01 level.


4.2.2. Model 1

During the second stage of the analysis, level 1 (teacher) variables were integrated in the two-level random intercepts model. First the demographic variables age and sex (teacher level) were integrated in the fixed part of model 1a to look for an association with the score on the 'Institutionalised ICT use' scale. With regard to sex, the category male was chosen as reference category. Since sex (χ² = 2.383, df = 1, p > .05) did not make a significant contribution, it was removed from the model. The significant effect of age disappeared (χ² = 3.773, df = 1, p > .05) when the variable was integrated as a sole variable in model 1b. Consequently, age was omitted from further analyses.

4.2.3. Model 2

In the next step, the ICT related variables ICT professional development and ICT competence (teacher level) were included in the fixed part of the model. The variables are centred around their grand mean (see Table 2) to make interpretation feasible. The intercept 45.93 in model 2 represents the overall mean for 'Institutionalised ICT use' across teachers with an average score on ICT professional development and ICT competence. Both ICT professional development and ICT competence make a significant contribution (χ² = 28.480, df = 1, p < .001 and χ² = 26.338, df = 1, p < .001, respectively), causing a substantial reduction in the between-school variance. The positive fixed slope of .24 associated with ICT professional development indicates that teachers who keep up with developments in the field of ICT integration have a higher overall mean score on 'Institutionalised ICT use' among all schools. Similarly, the positive fixed slope of .26 for ICT competence shows that teachers who consider themselves competent in integrating ICT into their classroom get a higher overall mean score on 'Institutionalised ICT use' among all schools. Model 2 fits the data significantly better than the null model (χ² = 98.774, df = 2, p < .001).

Fig. 2. School level residuals with 95% confidence bands.

Table 3. Estimates and standard errors from the random intercept model (dependent variable: Institutionalised ICT use among teachers).

 | Model 0 | Model 1a | Model 1b | Model 2 | Model 3a | Model 3b | Model 4a | Model 4b | Model 4c | Model 5 | Model 6
Fixed
Intercept (cons) | 45.88 (1.36)*** | 49.08 (2.48)*** | 45.84 (1.33)*** | 45.93 (1.05)*** | 45.99 (1.02)*** | 45.99 (1.01)*** | 46.30 (.96)*** | 46.52 (.94)*** | 46.51 (.94)*** | 46.33 (.95)*** | 46.58 (.98)***
Age | | −.20 (.10)* | −.19 (.10) | – | – | – | – | – | – | – | –
Sex | | −3.84 (2.49) | – | – | – | – | – | – | – | – | –
ICT professional development | | | | .24 (.05)*** | .23 (.05)*** | .22 (.05)*** | .20 (.05)*** | .20 (.05)*** | .22 (.05)*** | .21 (.05)*** | .21 (.05)***
ICT competence | | | | .26 (.05)*** | .23 (.05)*** | .24 (.05)*** | .21 (.05)*** | .22 (.05)*** | .22 (.05)*** | .21 (.05)*** | .21 (.05)***
Teacher efficacy | | | | | .24 (.11)* | .24 (.11)* | .22 (.11)* | .20 (.11) | – | – | –
Innovativeness | | | | | −.00 (.05) | – | – | – | – | – | –
Developmental beliefs | | | | | .23 (.09)** | .22 (.09)* | .23 (.09)** | .23 (.09)** | .25 (.09)** | .25 (.09)** | .28 (.09)**
Traditional beliefs | | | | | −.10 (.08) | – | – | – | – | – | –
ICT vision and policy | | | | | | | .31 (.10)** | .26 (.08)** | .27 (.08)*** | .26 (.09)** | .26 (.09)**
ICT support and coordination | | | | | | | −.05 (.11) | – | – | – | –
Structure initiating leadership | | | | | | | | | | .11 (.12) | –
Random
School level σ²u0 (between) | 52.51 (18.84)** | 47.30 (17.79)** | 47.98 (17.90)** | 21.69 (11.02)* | 20.28 (10.51) | 18.76 (10.23) | 11.88 (8.61) | 9.96 (7.88) | 11.67 (8.57) | 11.15 (8.48) | 15.12 (9.21)
Teacher level σ²e0 (within) | 324.53 (23.46)*** | 321.98 (23.25)*** | 323.57 (23.38)*** | 268.46 (19.35)*** | 260.16 (18.75)*** | 262.14 (18.89)*** | 255.03 (18.73)*** | 247.04 (17.76)*** | 257.38 (18.67)*** | 257.94 (18.95)*** | 223.66 (21.51)***
Model fit
Deviance (−2 log likelihood) | 3774.968 | 3768.883 | 3771.259 | 3676.194 | 3661.881 | 3663.558 | 3476.302 | 3577.509 | 3580.911 | 3480.034 | 3577.283
χ² | 20.063 | | | 98.774 | | 12.636 | | | 82.647 | |
df | 1 | | | 2 | | 2 | | | 0 | |
p | <.001 | | | <.001 | | <.01 | | | <.001 | |
Variance ρ (%) | 13.93 | | | 7.48 | | 6.68 | | | 4.34 | |

*Significant at the .05 level; **significant at the .01 level; ***significant at the .001 level.


4.2.4. Model 3

Subsequently, the general teacher conditions efficacy, innovativeness, developmental and traditional educational beliefs were added to model 3. Since there was no significant effect of innovativeness (χ² = .000, df = 1, p > .05) and traditional educational beliefs (χ² = 1.612, df = 1, p > .05) in model 3a, both were no longer retained as variables in model 3b. In this model, the intercept 45.99 represents the overall mean for 'Institutionalised ICT use' across teachers with an average score on ICT professional development, ICT competence, efficacy and developmental educational beliefs. Both efficacy (χ² = 4.815, df = 1, p < .05) and developmental beliefs (χ² = 6.213, df = 1, p < .05) significantly contribute to the model. Their positive slopes indicate that for every increase of one unit, the score on the 'Institutionalised ICT use' scale increases by .24 and .22, respectively. With regard to model improvement, model 3b seems to fit the data better than model 2 (χ² = 12.636, df = 2, p < .01). Considering the random part of the model, the inclusion of efficacy and developmental educational beliefs results in a non-significant between-school variance (χ² = 3.362, df = 1, p > .05). However, the between-teachers variance remains significant (χ² = 192.619, df = 1, p < .001).

4.2.5. Model 4

In the third stage of model specification, level 2 (school) variables were added to the model. In a first step, the ICT related school level variables 'schools' ICT vision and policy', 'ICT support and coordination' and 'ICT infrastructure' were integrated in model 4a. Since these school level variables were measured at the teacher level, aggregate measures expressed as means were created. The reliability of the aggregates was calculated using the intraclass correlation and the number of microlevel units used for averaging, $\lambda_j = n_j \rho_I / (1 + (n_j - 1)\rho_I)$ (Snijders & Bosker, 2012; Van Mierlo, Vermunt, & Rutte, 2009). All variables with a reliability coefficient below .70 are not presented in the overview Table 3. With regard to the aggregate variable 'ICT infrastructure', more than 50% of the reliability coefficients did not meet the cut-off score of .70. Consequently, 'ICT infrastructure' was omitted from further analysis. 'ICT support and coordination' (χ² = .258, df = 1, p > .05) did not significantly contribute to the model and was therefore removed from the model. Since teacher efficacy (χ² = 3.418, df = 1, p > .05) was no longer significant in model 4b, this variable was no longer retained. The significant positive slope associated with 'schools' ICT vision and policy' (χ² = 11.746, df = 1, p < .001) in model 4c indicates that teachers who work in schools with a well-considered ICT policy have a higher score on the 'Institutionalised ICT use' scale. Compared to model 3b, the model significantly improved by adding the school level variable 'schools' ICT vision and policy' (χ² = 82.567, df = 0, p < .001).
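To make the aggregation-reliability check concrete, the snippet below evaluates the formula above for two hypothetical schools; the intraclass correlation and the numbers of teachers per school are illustrative values, not figures reported in the study.

```python
# Sketch of the aggregate-reliability formula lambda_j = n_j*rho_I / (1 + (n_j - 1)*rho_I)
# (Snijders & Bosker, 2012). Values below are illustrative only.

def aggregate_reliability(n_j: int, rho_i: float) -> float:
    """Reliability of a school mean based on n_j teacher reports."""
    return n_j * rho_i / (1 + (n_j - 1) * rho_i)

# With rho_I = .25, a school mean based on 8 teachers has reliability ~0.73,
# just above the .70 cut-off used in the study; with 4 teachers it drops to ~0.57.
print(round(aggregate_reliability(8, 0.25), 2))  # 0.73
print(round(aggregate_reliability(4, 0.25), 2))  # 0.57
```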

4.2.6. Model 5

In the next step, the intention was to estimate a model containing the general school level variables 'participative decision making', 'school vision', 'professional relationships', 'structure initiating leadership' and 'supportive leadership'. Based on the reliability of the variables' aggregates, only 'structure initiating leadership' met the .70 cut-off score and was retained as a variable for further analysis. However, the positive slope associated with 'initiating structure' leadership was not significant (χ² = .858, df = 1, p > .05) and was therefore no longer included in the model.

4.2.7. Model 6

In the final stage of model specification, complex variance for all the variables of model 4c was allowed at the teacher and school level. However, only the random variation of the slope for 'schools' ICT vision and policy' at the teacher level was significant (χ² = 4.496, df = 1, p < .05). The non-significant variances and covariances were set to zero to simplify model 6. However, compared to model 4c, this did not result in an improved model (χ² = 3.628, df = 1, p > .05). Consequently, model 4c is retained as the final model.

Finally, the calculation of the squared multiple correlation coefficient (R²) offers us the proportion of explained variance, which is divided into the explained variance at the teacher and at the school level. The calculation of both proportions of explained variance is based on the alternative approach of Snijders and Bosker (1999, 2012). $R_1^2$ at the teacher level is defined as the proportional reduction of error for predicting an individual outcome, $R_1^2 = 1 - \frac{(\hat{\sigma}^2_{e0} + \hat{\sigma}^2_{u0})_{\text{conditional}}}{(\hat{\sigma}^2_{e0} + \hat{\sigma}^2_{u0})_{\text{unconditional}}}$, and $R_2^2$ at the school level as the proportional reduction of error for predicting a group mean, $R_2^2 = 1 - \frac{((\hat{\sigma}^2_{e0}/\tilde{n}) + \hat{\sigma}^2_{u0})_{\text{conditional}}}{((\hat{\sigma}^2_{e0}/\tilde{n}) + \hat{\sigma}^2_{u0})_{\text{unconditional}}}$ (Jee-Seon, 2009). The results in Table 3 indicate the proportion of explained variance for both the teacher ($R_1^2$) and school level ($R_2^2$) for the subsequent models that were used. The calculation of $\Delta R_1^2$ and $\Delta R_2^2$ allows us to identify the proportion of variance explained at the two levels by each set of predictors that was integrated in the subsequent models. Compared to model 2 (see Table 4), the proportion of explained variance in model 3b rises by 2.4% at the teacher level and 3.7% at the school level. This increase in the proportion of explained variance is due to the addition of 'teacher efficacy' and 'developmental beliefs' to the model. The inclusion of 'schools' ICT vision and policy' resulted in an extra 3.1% and 7.3% explained variance at the teacher level and school level, respectively. The teacher related ICT variables 'ICT competence' and 'professional development' accounted for 23.1% of the variance at the teacher level and 37.4% at the school level.
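As an illustration, the teacher-level proportion of explained variance for the final model can be reproduced directly from the variance components in Table 3; the sketch below shows this calculation. The school-level R² additionally requires the representative group size ñ, which is not reported, so it is left as a function of that assumed value.

```python
# Sketch of the Snijders & Bosker R^2 calculation, using the null-model and
# model-4c variance components from Table 3.

# Unconditional (null model) variance components
sigma2_e0_null, sigma2_u0_null = 324.53, 52.51
# Conditional (model 4c) variance components
sigma2_e0_m4c, sigma2_u0_m4c = 257.38, 11.67

# Teacher-level R^2: proportional reduction of error for an individual outcome.
r2_teacher = 1 - (sigma2_e0_m4c + sigma2_u0_m4c) / (sigma2_e0_null + sigma2_u0_null)
print(round(r2_teacher, 3))  # ~0.286, matching Table 4

def r2_school(n_tilde: float) -> float:
    """School-level R^2: proportional reduction of error for a group mean,
    given a representative number of teachers per school (n_tilde)."""
    conditional = sigma2_e0_m4c / n_tilde + sigma2_u0_m4c
    unconditional = sigma2_e0_null / n_tilde + sigma2_u0_null
    return 1 - conditional / unconditional
```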

Table 4. Proportion of variance explained at the teacher and school level.

 | Model 2 | Model 3b | Model 4c
R²1 (proportion of variance explained at the teacher level) | .231 | .255 | .286
ΔR²1 | | .024 | .031
R²2 (proportion of variance explained at the school level) | .374 | .411 | .484
ΔR²2 | | .037 | .073

5. Discussion and conclusion

By presenting a multilevel model of influencing school- and teacher-level factors, this study sheds light on the complex process of ICT use for teaching and learning in primary education. As such, the results are of particular importance for both researchers and policy makers in the field of ICT integration. More concretely, this study adds to the literature in several respects. First, it presents a new way of measuring ICT use by only focussing on those ICT activities that pupils and teachers most frequently do in primary schools. No studies in the ICT integration research literature have examined this kind of ICT use. This new variable 'Institutionalised ICT use in primary education' represents a broad spectrum of ICT activities for teaching and learning, as the variable is based on existing measurement scales. On the other hand, although we selected those items that represent the most used ICT activities, the frequencies of the 13 selected items are still rather modest. This is similar to earlier research (Smeets, 2005; Tondeur et al., 2007), and again stresses that there is room for improvement in realising ICT integration. Further, 'Institutionalised ICT use in primary education' proved to be a valid and reliable measurement instrument. By only focussing on the most used applications and practices, this new measurement scale can be considered as an instrument oriented towards Institutionalised ICT use. In other words, the scale is oriented towards full implementation and integration of ICT in classrooms. Second, the results of the multilevel analysis illustrate that 'Institutionalised ICT use' should not only be considered as a teacher phenomenon but also as a school phenomenon. The null model shows that about 14% of the variance in Institutionalised ICT use is due to between-school differences. This implies that an educational innovation like ICT, which is clearly situated in classrooms, can only be fully understood by taking a school perspective into account. Furthermore, the stepwise approach in the multilevel modelling analysis allowed us to study the complex interplay of the different contributing factors identified in the e-capacity model. In a final model (see model 4c), the variables 'ICT professional development', 'ICT competences', 'developmental educational beliefs', and 'schools' ICT vision and policy' are identified as variables associated with 'Institutionalised ICT use'. In this context, it is remarkable that the two specific ICT related teacher conditions, ICT competence and ICT professional development activities, accounted for 23% of the variance at the teacher level, and even for almost 38% of the variance at the school level. This is interesting as it illustrates that individual differences between teachers in ICT competences are explained by differences between the schools in which they work. This means that a concept like competences, which is regarded as something very individual, can be influenced by organisational features. This implies that schools as organisations have a major role to play in the development of individual teacher ICT competence and interconnected ICT professional development activities (Vanderlinde & van Braak, 2010). This demonstrates the relevance of organising ICT training activities for school teams instead of aiming these training activities at individual teachers (see for instance Valcke, Rots, Verbeke, & van Braak, 2007). In this context, electronic school team guiding systems like the Flemish pICTos tool (Vanderlinde, Tondeur, & van Braak, 2010) are worth mentioning, as such tools encourage full teacher participation in ICT policy development, which includes professional development. Besides, in times of economic difficulties, it is important to keep on investing in professional development activities for teachers to develop their ICT competences, although these in-service training initiatives are sensitive to austerity measures.
Relevant in this context is the plea of Galanouli, Murphy, and Gardner (2004), who argue that ICT professional development should reflect the level of ICT competence of the teachers involved. In this context, Cope and Ward (2002) argue that teachers not only need instruction in terms of ICT use, but also need professional development in terms of how educational technology can be used to enhance learning outcomes in students. ICT training activities always need a focus on both pedagogical aspects and teachers' ICT skills (BECTA, 2004), and need to be embedded in a supportive professional school culture (Dexter, Anderson, & Becker, 1999). The results of this study underline these statements and clearly emphasise the need to situate ICT use in a professional school culture. In this context, especially the condition of 'ICT vision and policy development' is of crucial importance. This finding is useful for school leaders as it underlines the importance of having a shared vision on the place of ICT in education and having a school based ICT policy plan.

Further research should focus on the specific variable 'Teachers' ICT competences', as this study underlines the importance of this variable. It is important to stress that 'Teachers' ICT competences' in this study are measured as self-reported data. Such an indirect measurement of ICT or digital competences provides a relatively weak indicator of actual competence, because research has already shown a merely moderate relationship between self-evaluation and ability (Mabe & West, 1982). Further research on teachers' ICT competences should therefore try to measure teachers' ICT competences in a direct and authentic way. Research in this domain is still in its infancy, and new studies are required to understand cognitive determinants of teachers' digital competencies and their relation with ICT integration in teaching and learning processes. In contrast, some initiatives are being developed to measure digital competences of students, e.g. the International Computer and Information Literacy Study of the IEA (Fraillon & Ainley, 2010) and the iSkills study of ETS (ETS, 2002). These studies attempt to measure students' digital competences in a direct and authentic way by using computer-based environments. Similar to these initiatives, research projects are needed to measure digital competences of teachers in similar computer-based environments. In a next step, research can then combine these two direct measurement instruments in new research designs. Next to these studies, qualitative studies are needed to further unravel the complex interplay between nested variables that may explain ICT use in classrooms. In-depth interviews with teachers may be a possibility to explore these relationships, as they are aimed towards a deeper understanding of teachers' sense-making, in combination with a holistic focus on what they do (e.g. curriculum content, goals, and outcomes) when implementing ICT-supported innovative pedagogical practices (see Voogt & Pelgrum, 2005).

References

Aesaert, K., Vanderlinde, R., Tondeur, J., & van Braak, J. (2013). About the content of educational technology curricula: a cross-curricular state of the art. Educational Technology Research & Development, 61, 131–151.
Afshari, M., Bakar, K. A., SuLuan, W., Samah, B. A., & Say, F. (2009). Factors affecting teachers' use of information and communication technology. International Journal of Instruction, 2, 77–104.
Anderson, R. E. (2002). Guest editorial: International studies of innovative uses of ICT in schools. Journal of Computer Assisted Learning, 18, 381–386.
Anderson, R. E. (2008). Implications of the information and knowledge society for education. In J. Voogt, & G. Knezek (Eds.), International handbook of information technology in primary and secondary education (pp. 5–22). New York: Springer.
Baylor, A. L., & Ritchie, D. (2002). What factors facilitate teacher skill, teacher morale, and perceived student learning in technology-using classrooms? Computers & Education, 39, 395–414.
BECTA. (2004). A review of the research literature on barriers to the uptake of ICT by teachers. Available from http://dera.ioe.ac.uk/1603/1/becta_2004_barrierstouptake_litrev.pdf Accessed 14.03.12.
BECTA. (2007). Evaluation of the Test Bed project. Retrieved online January 2, 2010 from http://www.google.be/firefox?client=firefox-a&rls=org.mozilla:nl:official.
Cope, C., & Ward, P. (2002). Integrating learning technology into classrooms: the importance of teachers' perceptions. Educational Technology & Society, 5, 67–74.
Cox, M. (2008). Researching IT in education. In J. Voogt, & G. Knezek (Eds.), International handbook of information technology in primary and secondary education (pp. 965–982). New York: Springer.
Dexter, S., Anderson, R. E., & Becker, H. J. (1999). Teachers' views of computers as catalysts for changes in their teaching practice. Journal of Research on Computing in Education, 31, 221–239.
Educational Testing Service (ETS). (2002). Digital transformation: A framework for ICT literacy. A report of the International ICT Literacy Panel. Princeton: ETS, Center for Global Assessment.
Fraillon, B. J., & Ainley, J. (2010). The IEA international study of ICILS [online]. Available from http://icils2013.acer.edu.au/wp-content/uploads/examples/ICILS-Detailed-Project-Description.pdf Accessed 12.03.12.
Galanouli, D., Murphy, C., & Gardner, J. (2004). Teachers' perceptions of the effectiveness of ICT competence training. Computers & Education, 43, 63–79.
Geijsels, F., Sleegers, P., Stoek, R. D., & Krüger, M. L. (2009). The effect of teacher psychological school organizational and leadership factors on teachers' professional learning in schools. The Elementary School Journal, 109, 406–427.
Geijsels, F., Sleegers, P., Van den Berg, R., & Kelchtermans, G. (2001). Conditions fostering the implementation of large-scale innovation programs in schools: teachers' perspectives. Educational Administration Quarterly, 37, 130–166.
Gorard, S. (2003). What is multi-level modelling for? British Journal of Educational Studies, 51, 46–63.
Granger, C. A., Morbey, M. L., Lotherington, H., Owston, R. D., & Wideman, H. H. (2002). Factors contributing to teachers' successful implementation of IT. Journal of Computer Assisted Learning, 18, 480–488.
Hannafin, M. J., & Land, S. M. (1997). The foundations and assumptions of technology-enhanced student-centred learning environments. Instructional Science, 25, 167–202.
Hermans, R. (2009). The influence of educational beliefs on the use of ICT as an educational innovation in primary education [PhD thesis]. Ghent: Ghent University, Department of Educational Studies.
Hermans, R., van Braak, J., & Van Keer, H. (2008). Development of the beliefs about primary education scale: distinguishing a developmental and transmissive dimension. Teaching and Teacher Education, 24, 127–139.
Hermans, R., Tondeur, J., van Braak, J., & Valcke, M. (2008). The impact of primary school teachers' educational beliefs on the classroom use of computers. Computers & Education, 51, 1499–1509.
Hoy, W., & Tarter, C. J. (1995). Administrators solving the problems of practice: Decision-making cases, concepts and consequences. Boston: Allyn & Bacon.
Hoy, W., & Tarter, C. J. (1997). The road to open and healthy schools: A handbook for change. Thousand Oaks: Corwin Press.
Jee-Seon, K. (2009). Multilevel analysis: an overview and some contemporary issues. In R. E. Millsap, & A. Maydeu-Olivares (Eds.), The Sage handbook of quantitative methods in psychology (pp. 337–361). London: Sage.
Karabenick, S. A. (2011). Classroom and technology-supported help seeking: the need for converging research paradigms. Learning and Instruction, 21, 290–296.
de Koster, S., Kuiper, E., & Volman, M. (2012). Concept-guided development of ICT use in 'traditional' and 'innovative' primary schools: what types of ICT use do schools develop? Journal of Computer Assisted Learning, 28, 454–464.
Kozma, R. (2003a). Technology and classroom practices: an international study. Journal of Research on Technology in Education, 36, 1–14.
Kozma, R. (2003b). ICT and educational change: a global phenomenon. In R. Kozma (Ed.), Technology, innovation, and educational change: A global perspective (pp. 1–18). Eugene: International Society for Technology in Education.
Mabe, P. A., & West, S. W. (1982). Validity of self-evaluation of ability: review and meta analysis. Journal of Applied Psychology, 67, 280–296.
Meneses, J., Fàbregues, S., Rodríguez-Gómes, D., & Ion, G. (2012). Internet in teachers' professional practice outside the classroom: examining supportive and management uses in primary and secondary schools. Computers & Education, 59, 915–924.
Niederhauser, D. S., & Stoddart, T. (2001). Teachers' instructional perspectives and use of educational software. Teaching and Teacher Education, 17, 15–31.
Smeets, E. (2005). Does ICT contribute to powerful learning environments in primary education? Computers & Education, 44, 343–355.
Snijders, T., & Bosker, R. (1999). Multilevel analysis: An introduction to basic and advanced multilevel modeling. London: Sage Publications.
Snijders, T., & Bosker, R. (2012). Multilevel analysis: An introduction to basic and advanced multilevel modeling (2nd ed.). London: Sage Publications.
Staessens, K. (1990). The professional culture of primary schools in innovation: Each school has its story. Leuven: Universitaire Pers.
Tondeur, J., van Braak, J., & Valcke, M. (2007). Towards a typology of computer use in primary education. Journal of Computer Assisted Learning, 23, 197–206.
Tschannen-Moran, M., & Woolfolk Hoy, A. (2001). Measurement of teacher sense of efficacy. Teaching and Teacher Education, 17, 783–805.
Valcke, M., Rots, I., Verbeke, M., & van Braak, J. (2007). ICT teacher training: evaluation of the curriculum and training approaches. Teaching and Teacher Education, 23, 795–808.
Van Braak, J. (2001). Individual characteristics influencing teachers' class use of computers. Journal of Educational Computing Research, 25, 141–157.
Van Driel, J. H., Bulte, A. M. W., & Verloop, N. (2007). The relationships between teachers' general beliefs about teaching and learning and their domain specific curricular beliefs. Learning and Instruction, 17, 156–171.
Van Mierlo, H., Vermunt, J. K., & Rutte, C. G. (2009). Composing group-level constructs from individual-level survey data. Organizational Research Methods, 12, 368–392.
Vanderlinde, R., & van Braak, J. (2010). The e-capacity of primary schools: development of a conceptual model and scale construction from a school improvement perspective. Computers & Education, 55, 541–553.
Vanderlinde, R., van Braak, J., & Hermans, R. (2009). Educational technology on a turning point: curriculum implementation in Flanders and challenges for schools. Educational Technology Research and Development, 57, 573–584.
Vanderlinde, R., van Braak, J., & Tondeur, J. (2010). Using an online tool to support school-based ICT policy planning in primary education. Journal of Computer Assisted Learning, 26, 434–447.
Voogt, J., & Pelgrum, H. (2005). ICT and curriculum change. Human Technology: An Interdisciplinary Journal on Humans in ICT Environments, 1, 157–175.
Watson, D. (2006). Understanding the relationship between ICT and education means exploring innovation and change. Education and Information Technology, 31, 307–320.
Yuen, A. H. K., Law, N., Lee, M. W., & Lee, Y. (2010). The changing face of education in Hong Kong: Transition into the 21st century. Hong Kong: Centre for Information Technology in Education.