Development of an electronic Portfolio system success model: An information systems approach

Igor Balaban a, Enrique Mu b, Blazenka Divjak a

a University of Zagreb, Faculty of Organization and Informatics Varazdin, Pavlinska 2, 42 000 Varazdin, Croatia
b Carlow University, 3333 Fifth Avenue, Pittsburgh, PA 15213, USA

Computers & Education 60 (2013) 396-411. http://dx.doi.org/10.1016/j.compedu.2012.06.013

Article history: Received 11 January 2012; Received in revised form 4 June 2012; Accepted 10 June 2012.

Abstract

This research has two main goals: to develop an instrument for assessing Electronic Portfolio (ePortfolio) success and to build a corresponding ePortfolio success model using DeLone and McLean's information systems success model as the theoretical framework. For this purpose, we developed an ePortfolio success measurement instrument and structural model, at the individual level of analysis, using responses from 186 ePortfolio student users from higher education institutions worldwide. Academic institutions can use the results of this research to assess the success of their ePortfolio implementations from their students' perspective. The ePortfolio success model can also help to improve the implementation and use of ePortfolio systems through the analysis of the causal relationships of their different dimensions. Finally, initial guidelines about how to use the instrument as part of an ePortfolio system review process are also discussed. © 2012 Elsevier Ltd. All rights reserved.

Keywords: ePortfolio success; ePortfolio system success

1. Introduction

Electronic Portfolios, or ePortfolios, constitute an extension to e-learning. They have become an important research topic in the last few years (Buzzetto-More & Alade, 2006; Fernández, 2008; Mu, Wormer, Foizey, Barkon, & Vehec, 2010; Stevenson, 2006). An extensive ePortfolio literature review reveals that ePortfolio systems are widely used but still not thoroughly studied in all their different dimensions. Furthermore, a model that describes a successful implementation of an ePortfolio system does not exist yet.

The European Institute for E-learning (EIfEL) (2009) defines ePortfolio as "a personal digital collection of information describing and illustrating a person's learning, career, experience and achievements". This definition suggests that in order to assess the success of the deployment of an ePortfolio in an academic institution, it is necessary to take into account the different possible dimensions: ePortfolio as a source of quality educational information that needs to be incorporated into the curriculum, ePortfolio as an information system architecture that needs to satisfy the users' needs and also be reliable, usable and seamlessly incorporated into the institutional ICT architecture, and ePortfolio as a new phenomenon that enhances learning by bringing the learner closer to the educational institution and potential employer while becoming a source of personal growth and development.

The proposed study on ePortfolio success employs an information system approach that takes the above-mentioned aspects into consideration. The success of an ePortfolio can, therefore, be interpreted as equivalent to the success of a specific information system (IS). For this reason, the DeLone and McLean (2003) model (hereafter D&M model), which has been widely used to measure IS success, is used to assess the success of an ePortfolio system. According to this model, IS success consists of six interconnected constructs. The presence of inner connections among the six dimensions (constructs) needs to be established to comprehend the exact structure and dependencies that constitute a successful ePortfolio implementation. The D&M model is well-known. Originally developed in 1992, it has been used and cited in more than 100 papers (DeLone & McLean, 1992; Petter, DeLone, & McLean, 2008). An updated version was developed ten years later and has also been widely used (DeLone & McLean, 2003; Petter et al., 2008). Given its acceptance in the IS research community, the D&M model constitutes a suitable theoretical framework to assess ePortfolio system success. However, because the D&M model has not been previously used in the ePortfolio context, a new set of model elements needs to be considered that match ePortfolio requirements. The main purpose and contribution of this research is the development of an ePortfolio success model applicable at the individual level of analysis.
2. Background on ePortfolios
Since Portfolios, in general, were developed to support the learning and teaching process and have been used for a long time, there are numerous definitions of student learning portfolios proposed by educators. The literature reveals a dozen possible definitions of the portfolio term; however, there are three that, in our opinion, illustrate the most important aspects of the term. A comprehensive definition was offered by Paulson, Paulson, and Meyer (1991), who described Portfolios as a meaningful collection of student work that demonstrates progress and/or mastery guided by standards and includes evidence of student self-reflection. Abrenica (1996) defined Portfolios as a collection of student achievement artifacts created during a period of time that serve as authentic assessment tools used to evaluate student learning. Finally, Barrett (1998) defined Portfolios as a purposeful collection of students' work that illustrates efforts, progress and achievement.

All these definitions describe Portfolio as a concept or a set of procedures and data that result in the demonstration of a student's capabilities. However, the full educational potential of Portfolios is obtained when the procedures and data identified in the aforementioned definitions are implemented as an information system. The current trend is moving toward implementing Portfolio initiatives as information systems that are supported by current information and communication technology (ICT) and, in particular, web-based technology. For this reason, this research refers exclusively to Web-based Portfolios. In order to differentiate a paper Portfolio from its electronic counterpart, the letter "e" will be added to the word Portfolio. Therefore, the term ePortfolio will be used hereafter to denote the most popular type of electronic Portfolio, i.e. the Web-based Portfolio being used currently.
Although the analysis of the aforementioned definitions may suggest that ePortfolios are used only by students, other entities such as administration, potential employers, lifelong learners and education institutions can use ePortfolios as well. Drawing on previous definitions, and taking into consideration that none of them specifically included the IT component, new definitions of ePortfolio have been coined. Barker (2003) considers ePortfolios as an electronic learning record which enables an individual to store, organize and present their work and accomplishments. The EIfEL, which leads the Europortfolio consortium and is a founding member of the European Foundation for Quality in E-Learning (EFQUEL), defines ePortfolio as a personal digital collection of information describing and illustrating a person's learning, career, experience and achievements. Furthermore, the definition proposed by EIfEL emphasizes that ePortfolios are privately owned and the owner has complete control over who has access to what and when. The Inter/National Coalition for Electronic Portfolio Research, which mostly deals with ePortfolios across the USA, defines ePortfolio as "a collection of diverse evidence created in authentic activity that is brought together and re-contextualized to say something about what I know and can do (how I have grown or changed), with an added interpretation" intended for one or more specific audiences (Cambridge, Cambridge, & Yancey, 2009, p. 145).
It should be noted that the above definitions emphasize the information system aspect of ePortfolios, but do not explicitly address the nature and range of ePortfolio ownership. In addition, the learning aspect of ePortfolios not only embraces the storage and presentation of past work and experience, but also encompasses reflection and feedback. It is these two latter features that may represent the biggest potential of ePortfolios as a Lifelong Learning (LLL) tool for growth and development, and should therefore not be overlooked.
The ePortfolio has evolved over the years, and Table 1 presents a brief comparison between the two major generations of ePortfolio definitions that can be identified during the ePortfolio development cycle.

Considering the various definitions referred to in this section, one more encompassing definition will be coined for this study to overcome the shortcomings of the previous ones (see Table 1). The proposed definition will take into account the ePortfolio purpose, type of information, entities involved and IT component. Therefore, for the purpose of this research, ePortfolio will be defined as a personal digital record that supports formal, informal and non-formal learning and contains evidence about one's accomplishments in the form of artifacts and reflection on learning, which can be provided to whomever the owner has chosen to grant permission.
3. ePortfolio as an information system
Several references that indicate the interrelationship between ePortfolio and IS can be found in the literature (Jafari, 2004; Mu et al., 2010; Richardson & Ward, 2005). In his description of ePortfolio, Jafari (2004) approached its development using the IS framework. Mu
Table 1. Analysis of existing Portfolio definitions.

1st generation of definitions (Portfolios):
- "Meaningful collection of student work that demonstrates progress and/or mastery guided by standards and includes evidence of student self-reflection" (Paulson, Paulson, & Meyer, 1991).
- "Collection of student achievement artefacts created during a period of time that serve as authentic assessment tools used to evaluate student learning" (Abrenica, 1996).
- "Purposeful collection of students' work that illustrates efforts, progress and achievement" (Barrett, 1998).

Shortcomings:
- Student only actor (does not include other possible types of actors such as organization or teacher)
- IT component missing
- Ownership issues such as copyright not considered

2nd generation of definitions (ePortfolio):
- "An electronic learning record which enables an individual to store, organize and present their work and accomplishments" (Barker, 2003).
- "A personal digital collection of information describing and illustrating a person's learning, career, experience and achievements" (European Institute for E-learning (EIfEL), 2009).

Shortcomings:
- Ownership issues not considered
- Does not include all possible types of user entities (such as potential employer)
- Does not include the most important type of support in learning ((self-)reflection, feedback, etc.) that makes the process far more advanced than before
et al. (2010) attempted to conceptualize the functional requirements for ePortfolio systems, referring to ePortfolios as consisting of people and technology. Although a superficial use of the IT framework for ePortfolios may lead one to consider them merely as IT tools, an ePortfolio system actually comprises much more. As with any other IS, where ePortfolios are concerned it is not sufficient to merely embrace the technology; it has to be adopted and used by people supporting all the required business processes in a proper way. An ePortfolio is a set of interrelated or meshed components and functionalities, similar to an IS. Therefore, ePortfolio applications should be put in a wider organizational context. This approach was taken in the research by Mu et al. (2010) in order to understand ePortfolio functionalities and their prioritization criteria. In the same paper, the authors discussed the challenges associated with the adoption of ePortfolios, drawing from the literature about IS adoption and assimilation. Furthermore, in their survey conducted in the UK, Richardson and Ward (2005) argued that ePortfolios should support LifeLong Learning. They also reported a significant discrepancy between ePortfolio applications and the requirements of a LifeLong Learning environment as an organizational system to be supported by ePortfolios.

In Table 2, common attributes extracted from IS definitions and ePortfolio definitions are grouped. It is assumed that similarities in definitions reflect similarities between the objects defined.

Considering the common attributes between IS definitions and ePortfolio definitions (Table 2), the following conclusions can be drawn:
- An ePortfolio is a set of interrelated components at the technical level. It comprises a Web application, hardware and software support, and a network infrastructure. These features qualify ePortfolio as an IS from the technical perspective.
- An electronic learning record established at the technical level supports processes from a business system. It enables users to collect, store, manage, process and disseminate information in the form of artifacts, which occurs at the data level. In this respect, the purpose of an ePortfolio is equivalent to that of an IS.
- Providing support for data and having ICT features is not sufficient for an entity to qualify as an IS. An ePortfolio fulfills its purpose only when an individual interacts with others by giving and receiving feedback in different forms, that is, when it is used within a community.
Based on a review of the extant literature, ePortfolio implementations must take into account three different stakeholders: individuals (students and teachers); institutions; and employers. A study of ePortfolios from all these perspectives is beyond the scope of a single research study, and therefore this research will focus on the assessment of ePortfolio deployment from the perspective of individual students. The majority of research on ePortfolios (Batson, 2002; Gathercoal, Love, Bryde, & McKean, 2002; Love, McKean, & Gathercoal, 2004; Ring & Foti, 2006; Stefani, Mason, & Pegler, 2007; Stevenson, 2006) focuses on the deployment process within an institution, defining ePortfolio requirements, and case studies of institutions that have implemented ePortfolios at the course level. However, "ePortfolio system implementation is in general a comprehensive educational innovation and therefore support has to be provided in both a pedagogical and technical sense (sic)" (Ring & Foti, 2006, p. 353).
Table 2. Comparison between definitions of IS and ePortfolio.

(1) IS definition: "A set of interrelated components working together to collect, process, store, and disseminate information" (Laudon & Laudon, 2002).
(2) ePortfolio definition: "A personal digital collection of information describing and illustrating a person's learning, career, experience and achievements" (European Institute for E-learning, 2009).
(3) Common attributes of IS and ePortfolio:
(a) An individual uses several components (other individuals, institutions, network, IT technology) to create a personal digital collection.
(b) Within the digital collection, information is collected, processed and stored.
(c) Illustrating one's career and achievements implies dissemination in the IS context.

(1) IS definition: "Work (organizational) system whose business process is devoted to capturing, transmitting, storing, retrieving, manipulating, and displaying information" (Alter, 2002).
(2) ePortfolio definition: "A meaningful collection of student work that demonstrates progress and/or mastery guided by standards and includes evidence of student self-reflection" (Paulson et al., 1991).
(3) Common attributes of IS and ePortfolio:
(a) Again, a meaningful collection implies the use of technology by people or organizations to gather, process and disseminate information.
(b) To demonstrate progress or mastery, or to provide evidence of reflection, the information gathered in (a) should be processed and disseminated accordingly using technology.

(1) IS definition: "A set of interconnected components that involve hardware, software, people and procedures and work together to achieve some objective" (Lawlor, 1994).
(2) ePortfolio definition: "An electronic learning record which enables an individual to store, organize and present their work and accomplishments" (Barker, 2003).
(3) Common attributes of IS and ePortfolio:
(a) An electronic learning record is a combination of hardware and software that enables creation, storage and presentation of information.
(b) In an IS, individuals represent people who use procedures (presenting their accomplishments) to achieve an objective.

Therefore, another important aspect of ePortfolio is the people and organizations that use it (either as users and/or as the audience). These are also the most important elements of almost every IS. By recognizing these elements in the ePortfolio context, we can conclude that ePortfolio is equivalent to IS in terms of people, community and organization involvement.
In summary, considering the results of the descriptive analysis, ePortfolio can indeed be perceived as an IS, since it meets all the IS requirements. Correspondence between the two is evident at all levels and in all aspects, thus providing sufficient evidence to use the D&M IS success model as the theoretical framework for the study of ePortfolio system success.
4. Electronic Portfolio system success

Several attempts have been made to investigate the perceived values of user acceptance, but those studies dealt only with users' attitudes toward use and perceived usefulness (Tzeng, 2011; Wang & Wang, 2009). Some broader concepts, such as benefits for end-users (like getting a better job), or their perceived system and information qualities, have not been taken into account.
By approaching the ePortfolio as an Information System, the updated D&M model (DeLone & McLean, 2003; Petter et al., 2008), as shown in Fig. 1, can be used to measure the success of ePortfolio system implementation. This means that the success of an ePortfolio can be put into a wider perspective instead of being constrained solely by its use.
Using the D&M Model, six ePortfolio system success constructs were operationalized in this study: System Quality, Information Quality, Service Quality, Use, User Satisfaction and Net Benefits.
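To make the idea of operationalization concrete, the sketch below averages hypothetical Likert-scale (1-5) survey items into one score per construct at the individual level. The item groupings, wordings and values are invented for illustration only; they are not the paper's actual measurement instrument.

```python
# Illustrative sketch only: average hypothetical Likert items (1-5) into the
# six D&M construct scores. Item groupings are invented, not the paper's
# actual instrument.
from statistics import mean

# One respondent's (invented) answers, grouped by construct.
responses = {
    "System Quality":      [4, 5, 4],  # e.g. ease of use, availability, response time
    "Information Quality": [4, 4, 3],  # e.g. views readable, up-to-date, verifiable
    "Service Quality":     [3, 4, 4],  # e.g. help-desk, manuals, individual attention
    "Use":                 [5, 4, 4],  # e.g. frequency, breadth of features used
    "User Satisfaction":   [4, 4, 5],  # e.g. attitude toward use, motivation
    "Net Benefits":        [4, 3, 4],  # e.g. enhanced learning, employability
}

def construct_scores(items: dict[str, list[int]]) -> dict[str, float]:
    """Average each construct's Likert items into a single score."""
    return {name: round(mean(values), 2) for name, values in items.items()}

scores = construct_scores(responses)
print(scores["System Quality"])  # 4.33
print(scores["Net Benefits"])    # 3.67
```

Real studies validate such scales (reliability, factor structure) before averaging; this sketch deliberately skips all of that.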
(1) System Quality: Measures of the Information Processing System Itself
This dimension measures the desirable characteristics of an IS. Since this dimension captures the system itself, it is oriented toward technical specifications like data processing capabilities, response time, ease of use, system reliability, and sophistication. According to DeLone and McLean (2003), the System Quality construct should measure technical success, which Shannon and Weaver (1949) defined as the accuracy and efficiency of the communication system that produces information. The most common measure of System Quality is the perceived ease of use related to the Technology Acceptance Model (TAM). However, many researchers, including DeLone and McLean, believe that perceived ease of use does not capture the construct as a whole (Petter et al., 2008). Therefore, researchers have created their own indices of System Quality based on literature review or DeLone and McLean's recommendations (Alberto & Gianluca, 2007; Gable, Sedera, & Chan, 2008; Rivard, Poirier, Raymond, & Bergeron, 1997; Wang & Wang, 2009).
In the ePortfolio context: The ePortfolio (Web) application is the system for processing information. Today's ePortfolios are Internet-based applications, so this construct measures the desired characteristics of an ePortfolio application (tool) in the Internet environment. Usability, functionality, user interface and security are examples of qualities that are valued by users of an ePortfolio application (Doig, Illsley, McLuckie, & Parsons, 2006; Hickerson & Preston, 2006; O'Brien, 2006; Stefani et al., 2007). More specifically, ePortfolio system quality is reflected in the ease of use, the availability of help functions, the ability of the ePortfolio system to be continuously up-and-running, its ability to provide a sufficiently quick response, and its integration with other on-line tools.
(2) Information Quality: Measures of Information System Output
This construct includes the desirable characteristics of system outputs. The quality of the information the system produces, primarily in the form of reports or web pages, is measured. Since DeLone and McLean developed their IS success model considering Shannon and Weaver's (1949) framework, this construct measures Shannon and Weaver's semantic success, that is, the success of the information in conveying the intended meaning. According to Petter et al. (2008), the Information Quality construct has proven problematic to capture and measure, as it is often not distinguished as a unique construct. While some researchers used existing generic scales of Information Quality (Petter et al., 2008), others developed their own scales. Some categories of Information Quality that can be measured are relevance, understandability, accuracy, completeness, usability, and importance. In this research, a combination of measures from Fraser and Salter (1997), Roldán and Leal (2003), and Wixom and Watson (2001) was used.
In the ePortfolio context: Information is processed by the ePortfolio application. Outputs present added value to society and to individuals themselves (Emmett, Harper, & Hauville, 2006; O'Brien, 2006; Stefani et al., 2007). Outputs should be valid, relevant, well formatted, easy to understand and up-to-date if students, teachers or employers are expected to use the ePortfolio. Two main types of information are produced in the ePortfolio in conjunction with the user: artifacts and views (presentations). Views can also be interpreted as artifacts; therefore, this construct measures the quality of views and artifacts produced by the ePortfolio application and the user. The quality is reflected in whether the artifacts can be verified and whether the artifacts or views are concise, readable, and up-to-date. In the ePortfolio context this construct was also studied by Doig et al. (2006) and Katerattanakul and Siau (2008).
(3) Service Quality: Measures of Support Provided to the End-User
Beyond quality software and satisfactory information, the nature and extent of the support that end-users receive in working with the system play a very important role in IS success. Therefore, this construct measures the quality of support that system users receive from the IS department and IT support personnel. This construct was added to the updated D&M Model based on previous research on the original D&M Model that identified the need for it. The importance of this construct is determined by the context, since Service Quality can be of great importance when measuring the success of an IS department, as opposed to that of individual systems. The most widely used method for measuring Service Quality is SERVQUAL (Parasuraman, Berry, & Zeithaml, 1988). Possible characteristics of this construct are responsiveness, accuracy, reliability, and technical competence. Since the ePortfolio is an on-line tool and activity, a set of statements from E-S-QUAL (Kim, Kim, & Lennon, 2006) and WebQual/eQual (Barnes & Vidgen, 2005) was used to encompass on-line service components such as efficiency, interaction, availability, privacy, virtual community and contacts.

Fig. 1. Updated D&M IS success model.
In the ePortfolio context: The means of support for using an ePortfolio depend on the context and range from on-line help, manuals and help-desk service to the ability to use the ICT equipment in institutions. Their importance is high, since inadequate user support can actually lead to poor use of the ePortfolio. Several cases that identified the importance of support with ePortfolios can be found in Doig et al. (2006), Flanigan and Amirian (2006), and Hickerson and Preston (2006), among others. Therefore, this construct measures end-users' assurance, empathy and clarity. In the ePortfolio-specific environment, service quality measures the individual attention paid to the user by the institution, the available means of end-user support, and how well the ePortfolio assessment and usage criteria are described in course requirements.
(4) System Use: Recipient Consumption of the System's Capabilities
This construct indicates the degree and manner in which staff and customers utilize the capabilities of an IS. Intention to Use and Use are strongly interconnected, and the authors suggest using Intention to Use as an alternative to Use in some contexts. Although Intention to Use describes an attitude and Use relates to behavior, either can be used depending on the context. Some authors have suggested removing this construct as a success variable because in most research it was too trivially defined. Since this is a complex variable, it is crucial to consider not only the nature of use but also the frequency of use. Some researchers inaccurately assumed that System Use was the most objective measure and the easiest to quantify, and tried to interpret the concept by measuring only the frequency of use. Therefore, when updating the D&M Model, its authors stressed the importance of this construct and suggested that researchers consider "the nature, extent, quality, and appropriateness of the system use" (DeLone & McLean, 2003, p. 76). There was also a debate about appropriate measures. Namely, many measures of use have been adopted in empirical studies, but in most cases those measures led to mixed results between use and other constructs. Therefore, considerable attention needs to be given to choosing appropriate measures in a specific context. Following DeLone and McLean's recommendations (Petter et al., 2008), several instruments were reviewed and used in this research, among them those of Burton-Jones and Straub (2006), Torkzadeh and Doll (1999), and Venkatesh, Morris, Davis, and Davis (2003), as well as the D&M Model itself.
In the ePortfolio context: The purpose of the ePortfolio is to support LLL. This construct assesses the degree and manner in which an individual uses the ePortfolio application and realizes its potential for LLL. In terms of the ePortfolio, it measures the system's functionalities being used by the user, such as features for organizing the ePortfolio content, joining groups, and artifact tagging, as well as the facilitating conditions present during the use of the ePortfolio. ePortfolio use was previously studied by Fernández (2008), who reported a high influence of ePortfolio use on user satisfaction.
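DeLone and McLean's caution that frequency alone under-measures Use can be sketched as follows. The log format and feature names below are invented; the point is only that a usage summary should capture breadth (how many distinct functionalities, such as tagging or joining groups, are exercised) alongside frequency.

```python
# Illustrative sketch: summarize (invented) ePortfolio usage logs into both
# frequency (how often) and breadth (how many distinct features), since
# counting events alone under-measures the Use construct.
from collections import Counter

log = [  # (user, feature) events from an imaginary ePortfolio system
    ("ana", "upload_artifact"), ("ana", "tag_artifact"),
    ("ana", "upload_artifact"), ("ana", "join_group"),
    ("ben", "upload_artifact"), ("ben", "upload_artifact"),
]

def use_summary(events: list[tuple[str, str]], user: str) -> dict[str, int]:
    """Count a user's events (frequency) and distinct features used (breadth)."""
    features = Counter(f for u, f in events if u == user)
    return {"frequency": sum(features.values()), "breadth": len(features)}

# A frequency-only measure would hide that ben used a single feature.
print(use_summary(log, "ana"))  # {'frequency': 4, 'breadth': 3}
print(use_summary(log, "ben"))  # {'frequency': 2, 'breadth': 1}
```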
(5) User Satisfaction: Recipient Response to the Use of the Output of an Information System
Users' level of satisfaction with reports, Web sites, and support services is measured with this construct. The main difference between this construct and the previous one can be identified when system use is mandatory. In that case, User Satisfaction becomes a very useful construct, because satisfaction will eventually lead to greater efficiency. Use and User Satisfaction are interrelated in both a process and a causal sense: Use precedes User Satisfaction, while greater Satisfaction will lead to greater Use. As in the case of System Use, the most popular measures for this construct also contain items related to other constructs. This is because these measures were originally designed to measure different categories, but many researchers simply adopted them and applied them to the D&M Model. Therefore, some researchers parsed out elements that do not measure this construct, or used their own scales. In this research, the part of Venkatesh et al.'s (2003) UTAUT instrument concerning attitude toward using technology, facilitating conditions and anxiety was used.
In the ePortfolio context: This construct assesses user satisfaction with the ePortfolio application and the information produced by that application. Users' satisfaction with views, artifacts and the feedback received will probably lead to greater use of the ePortfolio as an application and a concept. The attitude toward using the system and its usefulness are considered two of the most important elements of User Satisfaction in the ePortfolio context. Therefore, this construct measures whether the ePortfolio system makes work more interesting and motivates users to learn, whether all the resources necessary to use the ePortfolio are available, and whether an individual has the knowledge to work with the ePortfolio. Fernández (2008) investigated student learning ePortfolios and reported a strong positive impact of user satisfaction on the use of ePortfolio.
(6) Net Benefits: The Effect of the Information System on Specific Contextual Levels
The extent to which an IS contributes to the success of individuals, groups and other stakeholders is represented as Net Benefits. In the original model, the term "impact" was used to describe the effect of the IS on individuals and/or groups. Over the years of application of the D&M Model, it has become clear that individual and group impacts are not sufficient to measure success. In the light of those findings, rather than complicate the model with more impact categories and measures, its authors decided to group all the measures into a single category: Net Benefits. Depending on the level of study and the context, a finer granularity may be needed in order to distinguish and address sub-categories of benefits specific to the level of analysis and the observed context. This is the only construct that is case specific and depends entirely on the type of IS. This means that characteristics of e-commerce systems, such as improved decision-making, improved productivity, market efficiency, and cost reductions, which DeLone and McLean (2003) analyzed with the D&M Model, may not be applicable to other IS domains such as the ePortfolio. On the contrary, this is a rather complex construct that requires a whole new set of measures and characteristics to be developed for a specific type of IS domain. Studies by Blackburn and Hakel (2006), Gathercoal et al. (2002), Hickerson and Preston (2006), Kim et al. (2006), Marcoul-Burlinson (2006) and Helen Barrett3 that describe benefits of ePortfolios were also considered in capturing this construct.
Not all relationships that hold for IS systems in general can be assumed to apply to an ePortfolio. Therefore, besides potential relationships found in the ePortfolio literature, several assumptions from the D&M Model were also used to hypothesize relationships in the proposed ePortfolio success model at the individual (student) unit of analysis. Hypotheses about relationships in the model, with corresponding discussions, are presented below.
H1: Electronic Portfolio system quality has a positive effect on use of ePortfolio.
Wang and Wang (2009) analyzed the effects of system quality on use and reported that the former construct influenced the latter through perceived ease of use. Petter et al. (2008) also reported a possible positive influence of system quality on use. In the meta-analysis of their own research model, Sabherwal, Jeyaraj, and Chowa (2006) showed a significant positive relationship between the two constructs.
H2: Electronic Portfolio system quality has a positive effect on user satisfaction with ePortfolio.
Both Lin (2007) and Wang and Wang (2009) reported that system quality positively affects user satisfaction. Petter et al. (2008) analyzed 21 papers in IS studies that addressed the relationship between system quality and user satisfaction; all confirmed the existence of such a relationship. In the meta-analysis of their own research model, Sabherwal et al. (2006) showed a significant relationship between the two constructs.
H3: Electronic Portfolio information quality has a positive effect on use of ePortfolio.
Wang and Wang (2009) reported an indirect positive inuence of the information quality construct on the intention to use, and,eventually, the system use construct, which in the D&M IS successmodel are contained in the use construct. Similarly, Lin (2007) establishedthat information quality directly inuences intention to use an on-line learning system and indirectly affects its actual use.
3 Dr. Helen Barrett has been researching strategies and technologies for electronic portfolios since 1991, publishing a Website (http://electronicportfolios.org), chapters inseveral books on electronic portfolios, and numerous articles. She has been providing training and technical assistance on electronic portfolios for teacher educationIn the ePortfolio context: This is themost comprehensive and delicate construct because it is specic for every context but at the same timeit provides an opportunity to recognize specics of each ePortfolio. It needs to be developed separately for each type of IS because it capturesthe contribution of a specic type of IS to different target groups. This construct measures the extent to which ePortfolio enhances LLL. Someof the key aspects of Net Benets that concern the individual are enhanced learning through developing a positive attitude to LLL, achievinglearning outcomes, increased transparency in evaluation, and enhanced communication between student and teacher. The other importantaspects of Net Benets for an individual can be seen through personal growth and development in terms of evaluating ones progress towardthe achievement of personal goals, the ability to choose co-workers, benchmarking employability, and self-employability. At the same time,based on the information from ePortfolio, institutions can show their particular strengths and advantages or re-organize employees intoproject teams based on their interests, skills and work experience and advance work efciency. Moreover, employers can benet fromePortfolio in the recruitment process by, for instance, narrowing the list of potential employees based on the information provided in theirePortfolios. Students can also benet in this respect by enhancing their learning and managing their own growth and development. Manyexamples of ePortfolio benets can be found in Barret (1998), Batson (2002), Bisovsky and Schaffert (2009), Fernndez (2008), Gathercoalet al. 
(2002), Hickerson and Preston (2006), and Stevenson (2006).
Petter et al. (2008) reviewed and analyzed over 90 empirical studies in which the D&M model was used in different contexts, althoughnone of them could be applied to ePortfolio. However, there are several examples inwhich the D&Mmodel was used to assess the success ofon-line systems. Lin (2007) provides an example of applying the D&M model to measure on-line learning systems success. Signicantcorrelations among all the constructs of the model were established. This means that all the constructs and their interrelationships areimportant for the success of on-line learning systems. Similar research was conducted by Chen (2010), who used the D&M model todetermine the link between employees e-learning system use and their overall job outcomes.
In this paper, an instrument to assess ePortfolio success at the individual student level of analysis and the corresponding ePortfoliosuccess model (based on the D&M model and the ePortfolio extant literature) will be developed.
5. Research model
In this section, the rationale for each of the proposed hypotheses, stating connections between constructs from the proposed ePortfoliosuccess model (Fig. 2) is argued. For this purpose, over 50 ePortfolio papers were analyzed in which instances of ePortfolio usage, imple-mentation, and development were reported. There were only a few examples reporting on analysis of possible causal relationships amongsome factors that can be identied in the process of ePortfolio usage and related to the D&Mmodel (Doig et al., 2006; Katerattanakul & Siau,2008; Lpez Fernndez & Rodrguez Illera, 2009). Similarly, Lpez Fernndez and Rodrguez Illera (2009) also reported that there is verylittle literature on this type of eportfolio and few studies addressed at nding out the impact on students from their perspective. Similarly,the research results related to well-known IS success models (such as the D&MModel and Gable et al.s Model) do not mention the usage ofsuch models in the ePortfolio context (Gable et al., 2008; Petter et al., 2008; Seddon, Staples, Patnayakuni, & Bowtell, 1999). For this reason,the development of the ePortfolio measures required extensive work.
Since an ePortfolio is also a Web-based information system, relationships determined for Web-based systems and on-line informationprograms throughout the U.S. under a federal PT3 grant for many years. At the European ePortfolio Conference in Maastricht, October 2007, Dr. Barrett received the rst EIFELLifetime Achievement Award for her contribution to ePortfolio research and development. More information can be found at http://electronicportfolios.org/.
H4: Electronic Portfolio information quality has a positive effect on user satisfaction with ePortfolio.
Wang and Wang (2009) argued that information quality has a direct positive effect on perceived usefulness, part of the user satisfaction construct. Lin (2007) supported the relationship between information quality and user satisfaction in the context of learning systems. Petter et al. (2008) agreed that strong support for information quality influencing user satisfaction exists, noting that 15 of 16 papers in IS research reported the existence of such a relationship.
H5: Electronic Portfolio information quality has a positive effect on net benefits.
Katerattanakul and Siau (2008) conducted research about factors that influence the information quality of ePortfolios, establishing that information quality has a significantly positive influence on the final benefits. Doig et al. (2006) also reported the importance of use of the information generated in ePortfolio for further growth and development as one of the net benefits. The positive influence of information quality on net benefits was also confirmed by Petter et al. (2008).
H6: Electronic Portfolio service quality has a positive effect on the use of ePortfolio.
Lin (2007) argued that service quality positively influences intention to use and indirectly leads to the actual use of on-line learning systems. Wang and Wang (2009) also reported a direct influence of service quality on perceived ease of use, where perceived ease of use is part of the use construct.
H7: Electronic Portfolio service quality has a positive effect on user satisfaction with ePortfolio.
Fig. 2. Proposed ePortfolio success model.
Wang and Wang (2009) reported a direct positive influence of service quality on perceived usefulness of the system, where perceived usefulness of the system is part of the user satisfaction construct. Alberto and Gianluca (2006) showed a similar effect, indicating that training and support directly and positively influence user satisfaction. Lin (2007) also reported a significant relationship between service quality and user satisfaction.
H8: The use of ePortfolio has a positive effect on user satisfaction.
Wang and Wang (2009) found that self-efficacy, as part of the use construct, has positive effects on user satisfaction in this case. López Fernández and Rodríguez Illera (2009) reported a high positive influence of the use of a digital course ePortfolio on user attitudes and satisfaction, both of which are included in the user satisfaction construct in this research. The research results, along with the limited literature related to academic ePortfolios, show a clear effect on students' attitudes and beliefs, which affects their self-efficiency during a semester (López Fernández & Rodríguez Illera, 2009). In measuring IS success, Petter et al. (2008) noted that in four of five papers, use directly influences user satisfaction.
H9: Electronic Portfolio user satisfaction has a positive effect on use of ePortfolio.
Wang and Wang (2009) established that greater user satisfaction leads to greater intention to use and, eventually, greater use. Alberto and Gianluca (2006) supported this finding by proving that facilitating conditions, as part of user satisfaction, influence use. Lin (2007) reported the influence of user satisfaction on use in the on-line learning systems context as one of the strongest relationships in his model. In measuring IS success, Petter et al. (2008) noted that 17 out of 21 papers provided evidence for the effect of user satisfaction on use.
H10: Use of ePortfolio has a positive effect on net benefits.
López Fernández and Rodríguez Illera (2009) investigated the effect of ePortfolio use on student learning, reporting a strong positive impact of use on students' opinions and on enhancing their learning as part of net benefits, indicating that students' academic achievement "could be in part enhanced through the use of a digital learner portfolio in undergraduate and graduate university courses" (López Fernández & Rodríguez Illera, 2009). Burton-Jones and Straub (2006) and Petter et al. (2008) indicated the existence of a relationship between use and net benefits.
H11: Electronic Portfolio user satisfaction has a positive effect on net benefits.
In measuring Web-based IS success, Alberto and Gianluca (2006) showed that user satisfaction directly and positively influences net benefits. Petter et al. (2008) found a very strong positive relationship between user satisfaction and net benefits. In their analysis, all 14 papers reported a positive relationship between the two constructs.
H12: Electronic Portfolio net benefits have a positive effect on user satisfaction with ePortfolio.
Petter et al. (2008) provided evidence for the existence of a very strong positive relationship between net benefits and user satisfaction, noting that all 11 papers encompassed by their analysis confirmed the existence of such a relationship.
Based on these hypotheses, we propose the research model shown in Fig. 2.
6. Measurement instrument development
In order to develop a measurement instrument with good psychometric properties, we followed the instrument creation process suggested by Moore and Benbasat (1991). In addition to following the traditional instrument development paradigm, we also followed other guidelines and examples of instrument development typical for IS research (Armstrong & Sambamurthy, 2000; Lewis, Templeton, & Byrd, 2005; Straub, Boudreau, & Gefen, 2004).
Table 3 shows phases in the instrument development process and the initial reliability tests. First, the initial pool of items related to ePortfolio was created and content validity of the instrument was verified. For the purpose of operationalization of the research constructs, a literature review, existing measurement items, and new measurement items proposed by ePortfolio experts were used. Eighteen experts from nine different countries (Austria, Croatia, New Zealand, Poland, Russia, Slovenia, Spain, the United Kingdom, and the United States) verified content validity. Their roles and areas of expertise could be divided into three categories: institution representatives (experts in implementing ePortfolio at the institution level), educators (experts in using ePortfolio in teaching) and students (primarily experienced in using ePortfolio in learning and for self-presentation). This ensured that all user levels were adequately represented. Their task was to score the 175 items using the scale 0 = Cannot answer, 1 = Not relevant, 2 = Important (but not essential), and 3 = Essential. The work of Lewis et al. (2005) was followed here, since they utilized responses of both "important (but not essential)" and "essential", with the explanation that both of them were positive indicators of the item's relevance to ePortfolio. Based on the table in Lawshe (1975), the Content Validity Ratio (CVR) for each item was evaluated for statistical significance (0.05 alpha level). Statistical significance meant that more than 50% of the panelists rated the item as either important or essential. Items that were not significant at the 0.05 level were dropped. In addition, the mean CVR across the items was calculated as an indicator of the overall test content validity. The minimum value provided in Lawshe (1975) for 16 panelists is 0.48. In this research, the calculated mean CVR was 0.78, which indicated that the agreement among panelists was unlikely to have occurred accidentally.
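Lawshe's (1975) CVR computation described above can be sketched as follows. The panel size matches the study's 18 experts, but the per-item endorsement counts (panelists rating an item "Important" or "Essential") are invented for illustration:

```python
# Hypothetical illustration of Lawshe's (1975) Content Validity Ratio.
# Panel size follows the study (18 experts); endorsement counts are invented.

def content_validity_ratio(n_endorsing: int, n_panelists: int) -> float:
    """CVR = (n_e - N/2) / (N/2), where n_e is the number of panelists
    rating the item as relevant (here: 'Important' or 'Essential')."""
    half = n_panelists / 2
    return (n_endorsing - half) / half

panel = 18
endorsements = [16, 12, 9, 17]          # invented counts for four items
cvrs = [content_validity_ratio(e, panel) for e in endorsements]
print([round(c, 2) for c in cvrs])      # per-item CVRs
print(round(sum(cvrs) / len(cvrs), 2))  # mean CVR across items
```

A CVR of 0 corresponds to exactly half the panel endorsing the item; items below the significance threshold in Lawshe's table would be dropped, as described above.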
In the next step, all evaluation sheets were thoroughly analyzed again, but this time qualitatively. Based on the panelists' comments, redundant and ambiguous statements were excluded and some statements were modified according to the panelists' suggestions. Since the set of statements was quite comprehensive, the panelists did not recommend including any additional statements that might have been missing from the instrument. However, they recommended that, for consistency's sake, all the statements should be written in the first person, and therefore some of the statements were modified accordingly.
In order to ensure that the items represented the six constructs from the D&M model, construct validation was conducted using accepted procedures described by Chang and King (2005), Davis (1986, 1989), Moore and Benbasat (1991), Segars and Grover (1998), and Straub et al. (2004). To assess the reliability of the sorting procedure, two different measures were used: Cohen's Kappa inter-rater coefficient and the item placement procedure described in Moore and Benbasat (1991). Three rounds of Q-sorting were conducted to ensure convergent and discriminant validity of the constructs. Cohen's Kappa was above 0.70, which is the threshold suggested by Straub et al. (2004). The item placement ratio was above 65%. These showed an acceptable level of reliability of the sorting procedure. The first two rounds involved teachers, administrators and students. The third and final round of the Q-sort procedure involved only students, ensuring that the remaining list of items was applicable to the student population. In the process of reducing the number of items, attention was paid to ensure that comprehensiveness was not sacrificed.

Since the number of items in the first round was quite large (175), the items with no agreement between the judges that still scored as essential for ePortfolio in CVR were retained for the second round of Q-sort. In the first round of Q-sort, the same experts that did the CVR were used. The second round of Q-sorting was performed at two universities: Carlow University in Pittsburgh and the University of Zagreb, Faculty of Organization and Informatics Varazdin (FOI). For that purpose, a moderator was used at each university. The third round was conducted at FOI only and was aimed at gaining insight into possible sub-categories within the constructs and further refinement of the statements. The identified sub-categories still needed empirical testing, and for this purpose factor analysis was used after the field-test results were obtained.

Table 3
Results of content validity, construct validity and initial reliability of the scales.
(Columns show the number of items remaining after each phase; Alpha is the initial scale reliability.)

Construct             Initial   After CVR   After Q-sort (1st round)   After Q-sort (2nd round)   After inner construct Q-sort (3rd round)   Alpha
System Quality          47         43              43                        39                              19                              0.82
Information Quality     13         12              12                         9                               9                              0.86
Service Quality         22         19              14                         9                               9                              0.86
Use                     30          9               7                         6                               8                              0.56
User Satisfaction       12          9               9                         9                               6                              0.88
Net Benefits            51         40              22                        13                              11                              0.89
Total                  175        132             107                        85                              62
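The two sorting-reliability measures used above (Cohen's Kappa and the item placement ratio) can be sketched as follows. The sorting data for two judges are invented; the construct labels abbreviate the model's constructs:

```python
# Sketch of the Q-sort reliability measures, with invented sorting data
# for two judges (labels abbreviate the model's constructs).

from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's Kappa for two raters: (p_o - p_e) / (1 - p_e)."""
    n = len(labels_a)
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    ca, cb = Counter(labels_a), Counter(labels_b)
    p_e = sum(ca[k] * cb[k] for k in ca) / n ** 2   # chance agreement
    return (p_o - p_e) / (1 - p_e)

def item_placement_ratio(target, actual):
    """Share of items a judge placed into the intended construct."""
    return sum(t == a for t, a in zip(target, actual)) / len(target)

intended = ["SQ", "SQ", "IQ", "USE", "US", "NB"]
judge_1  = ["SQ", "SQ", "IQ", "USE", "US", "NB"]
judge_2  = ["SQ", "IQ", "IQ", "USE", "US", "NB"]
print(round(cohens_kappa(judge_1, judge_2), 2))
print(round(item_placement_ratio(intended, judge_2), 2))
```

With these invented sorts the Kappa is above the 0.70 threshold cited from Straub et al. (2004) and the placement ratio exceeds 65%.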
After completing the above steps, the instrument consisted of 19 items in system quality, nine items in information quality, nine items in service quality, eight items in use, six items in user satisfaction and 11 items in the net benefits construct. All items were measured using a five-point Likert-type scale from 1 (I don't agree/totally incorrect) to 5 (I totally agree/totally correct). Questions about general background were also included. A pretest made with ten undergraduate students who had experience with ePortfolio usage elicited feedback about visibility, clarity, readability, and time needed for completion. A pilot test provided deeper insight into typical and possible anomalous responses, as well as potential problems with statistical analyses. An initial reliability assessment of the scales was also performed. All scales exhibited good reliability (Cronbach's Alpha above 0.80). At this point, the instruments were deemed ready for the full-scale field study.
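The initial reliability check (Cronbach's Alpha) can be sketched as follows, on an invented matrix of five-point Likert responses rather than the study's data:

```python
# Minimal sketch of the Cronbach's Alpha reliability check on five-point
# Likert responses; the response matrix below is invented.

def cronbach_alpha(items):
    """items: list of per-item response lists (same respondents, same order).
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    k = len(items)

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(col) for col in zip(*items)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

responses = [  # 3 items x 5 respondents, values on a 1-5 scale
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 2, 4, 3],
]
print(round(cronbach_alpha(responses), 2))
```

For this invented matrix the result is about 0.89, above the 0.80 level the scales reportedly reached.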
7. Data collection
The following participant selection strategy was used:
1. On-line research was conducted to collect a list of as many institutions as possible in Europe and the USA that might have experience in implementing and using ePortfolio. Based on the results of the search, 72 institutions that reported any kind of experience with ePortfolio were targeted.
2. An invitation letter was mailed to an institution representative such as the dean, director or manager. If the institution decided to participate, another letter was sent to students in agreement with the institution's authority. Prior to this correspondence, the students were e-mailed by the institution and informed that they would get the invitation letter to join the survey. The e-mails sent to students in different institutions were not identical in their structure because each institution had its own regulations and requirements.
3. If no reply was received from the institution's authority, a gentle reminder was sent two weeks after the first e-mail. The same procedure was used for students if the response rate was low.
Two instruments for analysis were administered in this research. The first one, which measured ePortfolio success, was a self-administered survey intended for students, since the unit of analysis was the individual. The second instrument was intended for institutions in order to collect some general data. The data collection process started in June 2010 and ended in November 2010. At the end of the process, 28 different institutions worldwide completed the survey collecting general information about the institution, while 248 students completed the ePortfolio success survey. Careful screening of the responses in the latter survey showed that 62 user responses were not usable, since 54 users reported using ePortfolio in only one course and 8 of the users' answers were not valid (for example, students provided identical answers to all the questions). Therefore, 186 valid student responses were left for further analysis. The response rate for institutions that participated in the survey was 42%. However, calculating the student response rate for the ePortfolio success survey was much more difficult because almost none of the institutions that agreed to participate in this research and complete the institutional survey reported the exact number of targeted students. Therefore, this research ended when it was determined that there were enough data points in the ePortfolio success survey to perform the analysis and no institutions were left in the pool of potential participants. Table 4 shows the geographical dispersion of the institutions as well as the number of institutional responses and the corresponding student responses. Since some universities wanted to remain anonymous, only general data is displayed.
The respondents were 55.4% female and 44.6% male, a balanced gender sample. Most students (80.6%) fell into the 18–23 age range, the traditional university student population. The targeted population comprised students that had used ePortfolio in at least two courses or for a period of at least six months. Additionally, students used different ePortfolio software (in most cases Mahara and custom-made ePortfolios), which was very important: if all students had used the same ePortfolio software, the relationship between the system quality construct and the other constructs could not have been determined. Additionally, a very large number of respondents (94.6%) reported having from a few up to 20 artifacts in their ePortfolios. Most respondents (38.7%) had between 5 and 10 artifacts stored in ePortfolio. Most of them had their CVs in the form of a single artifact. Most respondents (69.4%) had used ePortfolio in only two courses. In addition, most students used ePortfolio on a monthly basis (59.2%), and a respectable number used ePortfolio at least once a week (35.5%). This is understandable because producing artifacts is not an everyday activity. An average ePortfolio user therefore tends to use the system at least once a week, and according to the data obtained in this research, most respondents fell into that category. All students were using ePortfolio for self-presentation (CV) and assessment within courses. A few students reported the usage of ePortfolio for PDP, but outside course assignments.
Table 4
Demographic structure of respondents.

Region            No. of universities   No. of participants
Central Europe             4                    120
Eastern Europe             1                     19
Western Europe             3                     33
USA                        2                     24
TOTAL                     10                    186
8. Data analysis and results
In general, covariance-based SEM (LISREL) was used to examine model fit for each construct and its subconstructs (to assess the measurement model). Variance-based SEM (SmartPLS) was used to test the relationships among the constructs (to test the hypotheses in the structural model).
8.1. Measurement model validation
Because the relatively small sample (N = 186) yielded a subject-to-item ratio of 3:1, PLS, rather than covariance-based SEM, was used to determine whether all the items loaded on their prospective constructs (Mahmood, Bagci, & Ford, 2004). To ensure proper convergent validity, and in order to follow accepted guidelines in the IS field, only the items that loaded over 0.6 on their prospective construct were retained (Gefen & Straub, 2005; Segars, 1997; Straub et al., 2004). Unidimensionality was established by eliminating the items that cross-loaded on more than one construct (that is, items loading over 0.5 on at least two constructs). The bootstrap method with 300 samples was conducted to obtain t-statistics. The minimum t-value was 3.65, while all the other values were much higher, indicating that the factor loadings were significant at p < 0.001. Next, exploratory factor analysis was used to determine the number of subconstructs in each dimension (Gefen, Straub, & Boudreau, 2000; Straub et al., 2004). Items with loadings below 0.50 were dropped (Chang & King, 2005; Gefen & Straub, 2005). Table 5 shows the factor structure for each construct.
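The two retention rules just described (own-construct loading above 0.6; drop items loading above 0.5 on two or more constructs) can be sketched as follows. The item names echo the instrument's labels, but the loadings are invented:

```python
# Illustration of the item-retention rules described above; all loadings
# below are invented, not the study's results.

def retain(item_loadings, own, keep_min=0.6, cross_max=0.5):
    """item_loadings: dict construct -> loading; own: the item's target construct."""
    if item_loadings[own] < keep_min:
        return False                      # weak convergent validity
    high = [v for v in item_loadings.values() if v > cross_max]
    return len(high) < 2                  # unidimensionality: no cross-loading

items = {
    "SYSQ3": ({"SQ": 0.72, "IQ": 0.14}, "SQ"),  # kept
    "SYSQ9": ({"SQ": 0.41, "IQ": 0.22}, "SQ"),  # dropped: loading below 0.6
    "NB10":  ({"NB": 0.62, "US": 0.53}, "NB"),  # dropped: cross-loads over 0.5
}
print({name: retain(l, own) for name, (l, own) in items.items()})
```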
8.2. Adjusting the measurement model fit
After identifying the subconstructs (factors) within each dimension, a CFA in LISREL was used again to examine the measurement model fit for each of the subconstructs, and finally for the constructs as a whole. The measured factors were first modeled in isolation, then in pairs, and finally as a collective network representing the whole construct, as suggested by Segars and Grover (1998).
All constructs showed satisfactory fit except for user satisfaction, where AGFI and RMSEA were not in the desired range. Therefore, two additional model fit indices were calculated to get a better insight into the fit of this specific construct: NFI = 0.93 and SRMR = 0.049. Since both indicators showed a good model fit (NFI > 0.9; SRMR < 0.05), it was decided that the construct, overall, shows a good fit. No further modifications were made. The model fit indices for each construct are presented in Table 6.
After unidimensionality and convergence had been achieved for all factors, factor names were assigned and the reliability of the measures was assessed in PLS by calculating Composite Reliability (CR) and Average Variance Extracted (AVE), as shown in Table 7. All the factors showed good reliability.

The final measurement instrument is presented in the Appendix.

Table 5
Factor pattern for each construct (N = 186).

Construct:            System Quality        Inf. Quality          Service Quality
Factors:              1       2             1       2             1
Items and loadings:   SYSQ1   0.609  0.421  IQ2  0.190  0.765     SERQ1  0.704
                      SYSQ2   0.522  0.360  IQ3  0.208  0.770     SERQ2  0.652
                      SYSQ3   0.723  0.141  IQ5  0.262  0.605     SERQ3  0.677
                      SYSQ4   0.634  0.318  IQ6  0.850  0.218     SERQ4  0.777
                      SYSQ5   0.649  0.203  IQ7  0.824  0.297     SERQ5  0.769 (a)
                      SYSQ6   0.669  0.416  IQ8  0.368  0.485     SERQ6  0.771
                      SYSQ7   0.187  0.774  IQ9  0.604  0.231     SERQ7  0.755
                      SYSQ8   0.255  0.638                        SERQ9  0.658
                      SYSQ10  0.424  0.592
Variance explained:   62.7%                 68.4%                 58.0%

Construct:            Use                   User Sat.             Net Benefits
Factors:              1       2             1                     1       2
Items and loadings:   U1  0.556  0.260      US1  0.817            NB1       0.249  0.809
                      U2  0.599  0.136      US2  0.757            NB2       0.271  0.762
                      U4  0.865  0.149      US3  0.856            NB3       0.322  0.609
                      U5  0.698  0.208      US4  0.760            NB4       0.302  0.712
                      U6  0.218  0.708      US5  0.759            NB5       0.419  0.602
                      U7  0.169  0.770      US6  0.649            NB6       0.701  0.346
                                                                  NB7       0.769  0.262
                                                                  NB8       0.694  0.203
                                                                  NB9       0.673  0.330
                                                                  NB10 (b)  0.566  0.509
                                                                  NB11      0.577  0.315
Variance explained:   68.1%                 65.7%                 65.1%

Bolded values mark the highest loading of the item on its prospective factor. Method of extraction was Common Factor Analysis with Varimax rotation.
(a) Item with a lower loading (below 0.5) that was retained.
(b) Item that cross-loaded on two factors was dropped.
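The CR and AVE indices reported in Table 7 are standard functions of the standardized loadings; a minimal sketch with an invented three-item factor:

```python
# Sketch of the Composite Reliability and AVE computations reported in
# Table 7, using invented standardized loadings for a three-item factor.

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    s = sum(loadings)
    errors = sum(1 - l ** 2 for l in loadings)  # error variance per item
    return s ** 2 / (s ** 2 + errors)

def average_variance_extracted(loadings):
    """AVE = mean of squared standardized loadings."""
    return sum(l ** 2 for l in loadings) / len(loadings)

loadings = [0.82, 0.76, 0.71]   # invented standardized loadings
print(round(composite_reliability(loadings), 2))
print(round(average_variance_extracted(loadings), 2))
```

For these invented loadings CR is about 0.81 and AVE about 0.58, i.e. above the CR > 0.7 and AVE > 0.5 thresholds noted under Table 7.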
8.3. Structural model testing
Different fit indices needed to be used to demonstrate model fit because PLS, when used to test the structural model, provides less exhaustive data about the model fit than LISREL. Henseler, Ringle, and Sinkovics (2010) suggest using the coefficient of determination (R²) of the endogenous latent variables as the essential criterion for structural model assessment. Another measure for assessing a structural model is prediction relevance (Q²), which explains the model's predictive capability. Tenenhaus, Esposito Vinzi, Chatelin, and Lauro (2005) suggest a global Goodness-of-Fit index (GoF) to be used in PLS as an alternative to the series of fit indices used in SEM. This index is employed to judge the overall fit of the model. The fourth criterion for structural model assessment used in this research is the estimates for path coefficients or regression weights, known as standard beta coefficients.

The analysis began by examining the model quality coefficients and the paths among the constructs in the structural model. The coefficient of determination (R²) indicated a substantial level for net benefits (0.72), and moderate levels for use (0.48) and user satisfaction (0.53). However, taking into consideration the significant paths that explain the use and user satisfaction constructs, a moderate R² was acceptable for those constructs. According to Henseler et al. (2010), if an endogenous latent variable is explained by only a few (i.e. one or two) exogenous latent variables, a moderate R² is acceptable. However, if an endogenous latent variable is explained by several exogenous variables, R² should exceed the 0.67 cutoff for the substantial level.

Prediction relevance (Q²) for all endogenous constructs was above zero, showing that the observed values were well constructed and that the model had predictive relevance. The GoF value was 0.56, which can be evaluated as an acceptable overall fit. For example, Karim (2009) reported a GoF of 0.37 as acceptable.
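The GoF of Tenenhaus et al. (2005) is the geometric mean of the average communality and the average R². A sketch using the R² values reported above, with invented communalities (so the result only approximates, and does not reproduce, the paper's GoF of 0.56):

```python
# Sketch of the global Goodness-of-Fit index (Tenenhaus et al., 2005):
# GoF = sqrt(mean communality x mean R^2). R^2 values follow the text;
# the communalities are invented for illustration.

import math

def gof(communalities, r_squared):
    mean_com = sum(communalities) / len(communalities)
    mean_r2 = sum(r_squared) / len(r_squared)
    return math.sqrt(mean_com * mean_r2)

r2 = [0.72, 0.48, 0.53]             # net benefits, use, user satisfaction
communalities = [0.58, 0.61, 0.65]  # invented AVE-style communalities
print(round(gof(communalities, r2), 2))
```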
The analysis of beta coefficients and the bootstrap procedure (cases = 150, samples = 500) showed the following:
1. System quality has a significant positive effect on use of the ePortfolio.
2. Information quality has a positive effect on net benefits.
3. Service quality has a significant positive effect on both use and user satisfaction.
4. Use has a significant positive effect on user satisfaction.
5. User satisfaction has a positive effect on net benefits.
6. Greater user satisfaction is related to greater use of ePortfolio.
7. Net benefits have a positive effect on user satisfaction.
8. All the paths from the first model remained significant, except for the one leading from service quality toward use.
The remaining relationships in the model were evaluated as non-significant. Table 8 summarizes the hypothesis testing.

Two additional criteria for assessing structural models in PLS can be found in the literature. Henseler et al. (2010) and Karim (2009) stress the significance of effect size (f²) and the relative impact of the structural model on the observed measures for a latent dependent variable, which is evaluated by means of q². According to Henseler et al. (2010), f² values of 0.02, 0.15, and 0.35 signify small, medium, and large effects on the structural level, respectively. The same authors describe q² values of 0.02, 0.15, and 0.35, respectively, as revealing a small, medium, or large predictive relevance of a certain latent variable, thus explaining the endogenous latent variable under evaluation.
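The f² effect size compares the endogenous construct's R² with and without a given exogenous construct; a sketch with invented R² values illustrating the thresholds just cited:

```python
# Sketch of the f^2 effect-size calculation and the Henseler et al. (2010)
# thresholds; the two R^2 values below are invented for illustration.

def effect_size_f2(r2_included, r2_excluded):
    """f^2 = (R^2_included - R^2_excluded) / (1 - R^2_included)."""
    return (r2_included - r2_excluded) / (1 - r2_included)

def label(f2):
    if f2 >= 0.35:
        return "large"
    if f2 >= 0.15:
        return "medium"
    if f2 >= 0.02:
        return "small"
    return "negligible"

f2 = effect_size_f2(0.53, 0.41)  # e.g. user satisfaction with/without a predictor
print(round(f2, 3), label(f2))
```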
Both f² and q² values were calculated for the significant paths in the model and are presented in Table 8. It is evident that the paths leading from user satisfaction to net benefits and vice versa have a large effect size and medium predictive relevance. Further, the path leading from system quality to use has a medium effect size and small predictive relevance. All other paths have both a small effect size and small predictive relevance.
In addition to the effects and relevance of the paths, several authors, such as Haenlein and Kaplan (2004), Henseler et al. (2010) and Karim (2009), recommend examining significant indirect effects, as well as direct effects, to gain insight into possible moderating or mediating effects of particular latent variables. Indirect effects can be calculated as a product of direct paths (Loehlin, 2004). For example, system quality has an indirect effect on user satisfaction through the use construct. This particular indirect effect can be calculated as the product of the beta weights of the direct paths leading from system quality to use and from use to user satisfaction. Table 9 shows only the significant direct and indirect effects (beta values) in the model.
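The product rule for indirect effects described above can be sketched as follows; the beta weights are invented placeholders, not the values of Table 9:

```python
# Sketch of the indirect-effect rule (Loehlin, 2004): multiply the beta
# weights along the path. Beta values below are invented placeholders.

paths = {
    ("system_quality", "use"): 0.42,
    ("use", "user_satisfaction"): 0.31,
}

def indirect_effect(path_dict, chain):
    """Product of the direct path coefficients along a chain of constructs."""
    effect = 1.0
    for a, b in zip(chain, chain[1:]):
        effect *= path_dict[(a, b)]
    return effect

chain = ["system_quality", "use", "user_satisfaction"]
print(round(indirect_effect(paths, chain), 3))  # 0.42 * 0.31
```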
A detailed analysis of indirect effects in the final model leads to several conclusions:
1. User satisfaction can be identified as an important mediating variable because all constructs in the model affect other constructs through this variable.
2. Use mediates the relationship between system quality and user satisfaction, as well as between service quality and user satisfaction.
3. Information quality has a strong indirect effect on user satisfaction and a noticeable indirect effect on use.
4. Service quality also has a strong indirect effect on net benefits and a minor indirect effect on use of ePortfolio.
5. System quality has a noticeable indirect effect on user satisfaction.
The remaining indirect effects shown in Table 9 are weak but not negligible.
Table 6
Measurement model fit indices.

          System quality   Information quality   Service quality   Use         User satisfaction   Net benefits
χ²(df)    45.76(24)        15.64(12)             25.16(17)         10.08(8)    27.41(6)            41.88(31)
RMSEA     0.070            0.040                 0.051             0.053       0.197               0.044
GFI       0.95             0.96                  0.97              0.96        0.91                0.96
AGFI      0.90             0.91                  0.93              0.91        0.68                0.92
CFI       0.98             0.98                  0.99              0.98        0.94                0.99
Table 7Reliability of factors in each construct.
I. Balaban et al. / Computers & Education 60 (2013) 396411 4079. Discussion
This research has addressed the problem of measuring the success of an ePortfolio system deployment at the individual level. For thispurpose, an ePortfolio success measurement model was developed, and based on eld tests a structural model was developed to study therelationships among the dimensions of the proposed ePortfolio success model.
9.1. Using the ePortfolio success instrument
The ePortfolio success instrument addresses the question How successful is our ePortfolio implementation from the end-usersperspective? Using this instrument, an educational institution can nd out, for example, whether the ePortfolio system needs improve-ment or whether the institution needs to raise the quality of services for its students. In other words, the ePortfolio success instrumentallows educational institutions to pinpoint specic areas that need improvement. Moreover, they can use the instrument as a benchmarkingtool with respect to past performance or in relation to the performance of other institutions. For example, the survey could be administeredprior to implementing changes in the ePortfolio system, and a follow-up survey could take place three to six months after changes weredeployed in the ePortfolio system.4 Such evaluations allow improvements or degradation in ePortfolio implementation and usage to bedetected.
9.2. Applying the ePortfolio success model
The principal aim of the ePortfolio success model, revised based on the PLS confirmatory test, is to show the effects its dimensions have on each other (as shown in Fig. 3). Furthermore, based on the established relationships, institutions can assess which dimensions need attention in order to improve the success of ePortfolio implementation and usage. For example, if the instrument pinpoints user satisfaction as a problematic dimension, the institution can realize that it should first try to improve service quality and possibly system quality to improve the benchmark of user satisfaction. Higher user satisfaction will lead to higher net benefits that will, in turn, affect user satisfaction.
Factor number  Constructs and factor names        CR(b)  AVE(c)
System Quality
  Factor 1     Usability                          0.89   0.58
  Factor 2     Functionality                      0.86   0.68
Information Quality
  Factor 1     Validity                           0.88   0.70
  Factor 2     Format                             0.88   0.64
Service Quality (a)                               0.92   0.58
Use
  Factor 1     Deep Structure Usage               0.85   0.65
  Factor 2     Facilitating Conditions            0.79   0.57
User Satisfaction (a)                             0.92   0.65
Net Benefits
  Factor 1     Enhanced Learning                  0.91   0.67
  Factor 2     Personal Growth and Development    0.90   0.64

a A single-factor construct.
b CR > 0.7.
c AVE > 0.5.

Moreover, the proposed ePortfolio success model clearly shows that the use and user satisfaction dimensions are very tightly connected. This means that use affects user satisfaction and vice versa. Such relationships are in accordance with the argument that the six dimensions are interrelated, showing not only process flows but also causality flows (DeLone & McLean, 2003). The same tight connection applies to the net benefits and user satisfaction dimensions.
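The CR and AVE values reported in Table 7 follow the standard composite-reliability and average-variance-extracted formulas over standardized factor loadings. A minimal sketch with hypothetical loadings (the paper reports only the resulting CR and AVE, not the underlying loadings):

```python
# Composite reliability (CR) and average variance extracted (AVE)
# computed from standardized factor loadings.
def composite_reliability(loadings):
    s = sum(loadings)                        # sum of loadings
    e = sum(1 - l ** 2 for l in loadings)    # sum of item error variances
    return s ** 2 / (s ** 2 + e)

def ave(loadings):
    return sum(l ** 2 for l in loadings) / len(loadings)

loadings = [0.78, 0.81, 0.74, 0.76]  # hypothetical loadings for a four-item factor
print(round(composite_reliability(loadings), 2), round(ave(loadings), 2))
```

CR above 0.7 and AVE above 0.5 are the conventional thresholds referenced in the table's footnotes.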
Another interesting finding is that information quality does not directly affect use or user satisfaction. Instead, it was shown that the information quality dimension affects net benefits that, in turn, affect user satisfaction and use. This is consistent with expectations, because information is a product of use and is in direct service of benefits that arise from ePortfolio usage. Students use the ePortfolio to produce information about themselves or to be assessed based on the produced information; therefore, information from the ePortfolio is a direct product that can be used to trigger other activities in LifeLong Learning. The quality of the information produced in the ePortfolio actually presents an added value to the individual and to society. Based on the information produced in the ePortfolio, the individual sees the benefits in terms of enhanced learning and personal growth and development. These benefits lead to satisfied users of the ePortfolio, which will lead to greater use of the ePortfolio.
9.3. Implementing an eP success review system
The implementation of an eP success review process is not much different from that of an enterprise-wide IS. In both cases the IS needs to be fully assimilated, that is, routinized into the organizational (in this case educational) processes, to provide expected benefits (Armstrong &
4 This three-month interval is necessary to allow students to perceive changes in service quality, system quality, or other areas and to express higher satisfaction or notice benefits.
Table 8. Summary of hypotheses testing.

Hypothesis  Relationship                              t-value  β-value  Result     f²    q²
H1          System Quality → Use                      7.29**   0.54     Supported  0.27  0.08
H2          System Quality → User Satisfaction        0.38     0.04     Rejected
H3          Information Quality → Use                 0.51     0.04     Rejected
H4          Information Quality → User Satisfaction   1.44     0.14     Rejected
H5          Information Quality → Net Benefits        3.37**   0.63     Supported  0.12  0.03
H6          Service Quality → Use                     2.41*    0.23     Supported  0.03  0.02
H7          Service Quality → User Satisfaction       4.87**   0.47     Supported  0.08  0.05
H8          Use → User Satisfaction                   2.40*    0.20     Supported  0.04  0.02
H9          User Satisfaction → Use                   2.51*    0.22     Supported  0.04  0.01
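The t-values in Table 8 are bootstrap-based (150 cases and 500 resamples, per the table note). A rough sketch of the idea, using an ordinary least-squares slope on toy data as a stand-in for a PLS path coefficient (an assumption for illustration; the paper's estimates come from PLS, not OLS):

```python
import random

def slope(xs, ys):
    # OLS slope as a simple stand-in for a structural path coefficient.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

def bootstrap_t(xs, ys, samples=500, seed=1):
    """t-statistic: full-sample estimate over the bootstrap standard error."""
    random.seed(seed)
    full = slope(xs, ys)
    n = len(xs)
    estimates = []
    for _ in range(samples):
        pick = [random.randrange(n) for _ in range(n)]  # resample with replacement
        estimates.append(slope([xs[i] for i in pick], [ys[i] for i in pick]))
    m = sum(estimates) / samples
    se = (sum((e - m) ** 2 for e in estimates) / (samples - 1)) ** 0.5
    return full / se

# Toy data with a clear positive relationship.
xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [1.2, 1.9, 3.2, 3.8, 5.1, 6.2, 6.8, 8.1]
print(bootstrap_t(xs, ys) > 2)  # clearly significant on these toy data
```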
Sambamurthy, 2000). Given that this stage, also called the onward and upward phase of the system life cycle, may typically take from 6 months to 1 year after initial implementation (Markus & Tanis, 2000), an assessment at the end of each academic semester seems advisable the first year. From then on, and assuming satisfactory outcomes, the survey instrument may be administered on an annual basis. Also, although most of the survey items are context-free, there are some (e.g. eP System Use) that may be context-sensitive. For this purpose, when assessing ePortfolio success, it may be convenient to compare the eP success assessment within similar eP use groups. For example, if we assume there is heavy use of ePortfolios by humanities students in comparison to other disciplines, it may be better to compare the opinions of ePortfolio student users from the humanities separately from other groups. The nature of the different groups to be assessed will be specific to each institution, but separating groups into at least three categories corresponding to extensive use, average use, and little use would seem reasonable.
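The grouping suggestion above can be sketched as a simple bucketing step before computing group-level benchmarks. The thresholds and activity measure below are hypothetical, not taken from the paper:

```python
# Hypothetical bucketing of respondents into the three use-intensity bands
# suggested in the text, so that eP success scores are compared within
# comparable groups. "Weekly sessions" and the cut-offs are illustrative.
def use_group(weekly_sessions):
    if weekly_sessions >= 5:
        return "extensive use"
    if weekly_sessions >= 2:
        return "average use"
    return "little use"

students = {"A": 7, "B": 3, "C": 0, "D": 5}
groups = {}
for name, sessions in students.items():
    groups.setdefault(use_group(sessions), []).append(name)
print(groups)
```

Survey results would then be benchmarked per group rather than pooled across the whole institution.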
10. Limitations and future research
Although the number of data points was sufficient for the individual level of analysis, a larger number of individual participants and participating institutions would have made it possible to theorize and explore critical success factors and challenges for electronic portfolio assimilation at the organizational level (Mu et al., 2010). On the other hand, the number of data points was sufficient to analyze and validate the ePortfolio success model. The low response rate, although a cause for concern, can be attributed to two factors. First was the need to survey the students through administrative channels, which made survey follow-up difficult. Second, the pool of students with the required ePortfolio experience is small simply because ePortfolio is not widely used.
With respect to previous research, ePortfolios, as previously mentioned, have different users, mainly students, administrators, and potential employers. The logical extension of this research would be to develop, using the student instrument as a reference, specific instruments for administrators and potential employers. A second line of future research could focus on developing a set of case studies of the ePortfolio review process following the guidelines suggested for this purpose in the previous section. This would allow the development of best practices when performing the ePortfolio success assessment. From a pedagogical point of view, it would be very useful to explore under which conditions the use of ePortfolio has the most positive impact on the achievement of learning outcomes in different learning environments.
Table 8 (continued).

Hypothesis  Relationship                        t-value  β-value  Result     f²    q²
H10         Use → Net Benefits                  1.52     0.13     Rejected
H11         User Satisfaction → Net Benefits    5.42**   0.55     Supported  0.68  0.16
H12         Net Benefits → User Satisfaction    6.24**   0.63     Supported  0.61  0.23

*p < 0.05, **p < 0.001; t-values were calculated by the bootstrap with 150 cases and 500 samples; the Q² used in q² was calculated by the blindfolding procedure with the omission distance parameter set to 73.

11. Conclusion

This study proposes a D&M IS-based model to measure the success of ePortfolio system deployment in academic institutions. From a theoretical point of view, this study extends the use of the D&M Model to the context of educational ePortfolios, taking the view that an ePortfolio is an information system. More importantly for the practitioner, the ePortfolio success model proposed in this research addresses two practical questions. The first question asks how successful the ePortfolio deployment is (as perceived by the end-users) and the second asks where the improvement process should focus.
The final structural model has six dimensions (System Quality, Information Quality, Service Quality, Use, User Satisfaction, Net Benefits) that are interrelated directly or indirectly. It was shown that System Quality has a significant positive influence on Use of ePortfolio, that
Table 9. Effects of variables (beta values*) in the final structural model.

                     Use                  User satisfaction     Net benefits
                     Direct   Indirect    Direct   Indirect     Direct   Indirect
System Quality       0.54                          0.11                  0.06
Information Quality           0.09                 0.40         0.63
Service Quality      0.23     0.06        0.47     0.05                  0.29
Use                                       0.20                           0.11
User Satisfaction    0.22                                       0.55
Net Benefits                  0.14        0.64

*p < 0.05.
Information Quality has a positive effect on Net Benefits, that Service Quality has a significant positive effect on both Use and User Satisfaction, that Use has a significant positive effect on User Satisfaction, and that User Satisfaction has a positive effect on Net Benefits.
In conclusion, the ePortfolio success instrument developed in this study can become a practical tool for academic institutions that are assessing ePortfolio implementation success. Similarly, the ePortfolio success model can also help educational organizations determine specific ePortfolio system dimensions in need of improvement. In addition, some initial guidelines for the implementation of an ePortfolio review process have been suggested.
Fig. 3. The revised ePortfolio success model. *p < 0.05, **p < 0.001.
Appendix. Final ePortfolio success instrument

Item (Source)

System Quality

Usability
Using the system is easy to learn. (Existing a)
Help functions are available and sufficient for using the system. (Existing b)
The system's sitemap clearly shows the organization of materials. (Modified c)
The views (i.e. selected collections of artifacts for self-presentation) are easy to manage. (New)
It is possible to quickly search (e.g. using a search engine) through ePortfolio content. (Modified c)
The system includes necessary features and functions for managing ePortfolio. (New)

Functionality
The system is always up-and-running as necessary. (Existing a)
The system is compatible with other systems I frequently use (e.g. Web 2.0 tools such as Blog, Wiki and similar).
The system can be accessed with a conventional Web browser without much preparation. (New)

Information Quality

Validity
The information provided by the ePortfolio is complete. (Existing g)
The information provided by the ePortfolio is always up-to-date. (Existing g)
The information provided by the ePortfolio is relevant. (Existing f)
The information provided by the ePortfolio is concise (contains only necessary data). (Existing a)

Format
The information provided by the ePortfolio appears readable, clear and well formatted. (Existing a)
The information provided by the ePortfolio is easy to understand. (Existing a)
The information provided by the ePortfolio is in a readily usable form. (Existing a)

Service Quality
A specific person (or group) is available for assistance with system difficulties. (Existing d)
E-mail and other forms of on-line help are available in case of problems with using the system. (Existing c)
Teachers/ePortfolio support staff are helpful for using the system. (New)
Teachers/ePortfolio staff are competent to answer questions. (New)
The institution gives the user individual attention. (Existing e)
Teachers/ePortfolio staff are always willing to help. (Existing e)
Teachers/ePortfolio staff respond promptly. (Existing e)
ePortfolio use is well described within the course requirements (e.g. ePortfolio tasks, evaluation of work in the ePortfolio, extra credits).

Use

Deep Structure Usage
While using the ePortfolio, I use available features for organizing my content. (New)
While using the ePortfolio, I collaborate with my peers in organizing ePortfolio content. (New)
While using the ePortfolio, I use features that help me to join the groups. (New)

Facilitating Conditions
While using the ePortfolio, I use features that help me to set view permissions for different views (ePortfolios).
I have the knowledge necessary to use the system. (Existing d)
I was able to complete a task using the system even if there was no one around to tell me what to do as I go.

User Satisfaction
I like working with the system. (Existing d)
The system makes work more interesting. (Existing d)
Using the system is a good idea. (Existing d)
I find the system useful in learning. (Existing d)
The degree of freedom for expressing one's own individuality is satisfactory. (New)
The ePortfolio presentation capabilities (e.g. quick upload, format and presentation of personal information) are satisfactory.
Net Benefits

Enhanced Learning
The ePortfolio encourages me to develop a positive attitude to LifeLong Learning. (New)
The ePortfolio helps me to make connections between formal (i.e. structured learning within the school or faculty) and informal (i.e. unstructured learning occurring in everyday life) learning experiences.
The ePortfolio helps me to fulfill learning outcomes. (New)
Using ePortfolio leads to increased transparency in evaluation. (New)
The enhanced communication between me and educators enhances the chances for my success. (New)

Personal Growth and Development
I am able to evaluate progress toward achievement of my personal goals. (New)
I am able to choose my co-workers among peers according to various criteria (interests) presented in ePortfolio. (New)
I am able to compare myself with others. (New)
I am able to show my personal growth and development over time. (New)
Potential employers can view my showcase Portfolio within the context of my institution's requirements, assessment criteria, and my personal descriptions of achievements.

Answers are on a 1–5 point Likert-type scale (1 = I disagree/Completely untrue; 3 = Can't decide/Neither true nor untrue; 5 = I completely agree/Completely true).

List of sources:
a Gable et al. (2008). b Rivard et al. (1997). c Kim et al. (2006). d Venkatesh et al. (2003). e Parasuraman et al. (1988). f Barnes and Vidgen (2005). g Fraser and Salter (1997).

References

Abrenica, Y. (1996). Electronic portfolios. USA: College of Education, San Diego State University. Retrieved from http://edweb.sdsu.edu/courses/edtec596r/students/abrenica/abrenica.html (August 2009).
Alberto, D. T., & Gianluca, Z. (2006). Web-based information systems success: a measurement model of technology acceptance and fit. EuroMOT 2006 Conference. Retrieved from http://www.iamot.org/conference/index.php/ocs/9/paper/view/1835/847 (October 2010).
Alberto, D. T., & Gianluca, Z. (2007). Web-based information systems success: a measurement model of technology acceptance and fit. In EuroMOT 2006 conference. Retrieved from http://www.iamot.org/conference/index.php/ocs/9/paper/view/1835/847 (October 2010).
Alter, S. (2002). Information systems: The foundation of e-business (4th ed.). New Jersey, USA: Prentice Hall.
Armstrong, C. P., & Sambamurthy, V. (2000). Information technology assimilation in firms: the influence of senior leadership and IT infrastructures. Information Systems Research, 10(4), 304–327.
Barker, K. C. (2003). ePortfolio quality standards: An international development project, discussion paper. Vancouver, Canada: FuturEd. Retrieved from http://www.futured.com/pdf/ePortfolio%20Quality%20Discussion%20Paper.pdf (January 2010).
Barnes, S., & Vidgen, R. (2005). Data triangulation in action: using comment analysis to refine web quality metrics. In Proceedings of the 13th European conference on information systems [CD-ROM]. Regensburg, Germany.
Barrett, H. C. (1998). Strategic questions: what to consider when planning for electronic portfolios. Learning & Leading with Technology, 26(2), 6–13. http://electronicportfolios.org/portfolios/LLTOct98.html (October 2009).
Batson, T. (2002). The electronic portfolio boom: What's it all about? New York, USA: Campus Technology. Retrieved from http://www.campustechnology.com/Articles/2002/11/The-Electronic-Portfolio-Boom-Whats-it-All-About.aspx (August 2009).
Bisovsky, G., & Schaffert, S. (2009). Learning and teaching with E-Portfolios: experiences in and challenges for adult education. International Journal of Emerging Technologies in Learning, 4(1), 13–15.
Blackburn, J. L., & Hakel, M. D. (2006). Enhancing self-regulation and goal orientation with ePortfolios. In A. Jafari, & C. Kaufman (Eds.), Handbook of research on ePortfolios (pp. 83–89). London, UK: IGI Global.
Burton-Jones, A., & Straub, D. (2006). Reconceptualizing system usage: an approach and empirical test. Information Systems Research, 17(3), 220–246.
Buzzetto-More, N., & Alade, A. (2006). Best practices in e-assessment. Journal of Information Technology Education, 5, 251–269. http://jite.org/documents/Vol5/v5p251-269Buzzetto152.pdf (September 2009).
Cambridge, D., Cambridge, B., & Yancey, K. B. (2009). Electronic portfolio technology and design for learning. In D. Cambridge, B. Cambridge, & K. B. Yancey (Eds.), Electronic portfolios 2.0: Emergent research on implementation and impact. Virginia, USA: Stylus.
Chang, C. J., & King, W. R. (2005). Measuring the performance of information systems: a functional scorecard. Journal of Management Information Systems, 22(1), 85–116.
Chen, H. J. (2010). Linking employees' e-learning system use to their overall job outcomes: an empirical study based on the IS success model. Computers & Education, 55, 1628–1639.
Davis, F. D. (1986). A technology acceptance model for empirically testing new end user information systems: Theory and results. Doctoral Dissertation, Massachusetts Institute of Technology. Retrieved from http://dspace.mit.edu/handle/1721.1/15192 (January 2010).
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340.
DeLone, W. H., & McLean, E. R. (1992). Information systems success: the quest for the dependent variable. Information Systems Research, 3(1), 60–95.
DeLone, W. H., & McLean, E. R. (2003). The DeLone and McLean model of information systems success: a ten-year update. Journal of Management Information Systems, 19(4), 9–30.
Doig, B., Ilisley, B., McLuckie, J., & Parsons, R. (2006). Using ePortfolios to enhance reflective learning and development. In A. Jafari, & C. Kaufman (Eds.), Handbook of research on ePortfolios (pp. 158–167). London, UK: IGI Global.
Emmett, D., Harper, W., & Hauville, K. (2006). Creating a strategy for the implementation of the QUT ePortfolio. In A. Jafari, & C. Kaufman (Eds.), Handbook of research on ePortfolios (pp. 410–419). London, UK: IGI Global.
European Institute for E-learning (EIfEL). (September 2009). Why do we need an ePortfolio? http://www.eife-l.org.
Fernández, O. L. (2008). Digital learner portfolio as a tool for innovating assessment in the European Higher Education Area. Interactive Educational Multimedia, 16, 54–65.
Flanigan, E., & Amirian, S. (2006). EPortfolios: pathway from classroom to career. In A. Jafari, & C. Kaufman (Eds.), Handbook of research on ePortfolios (pp. 102–111). London, UK: IGI Global.
Fraser, S., & Salter, G. (1997). A motivational view of information systems success: a reinterpretation of DeLone & McLean's model. In Proceedings of the sixth Australasian conference on information systems (pp. 119–140). Australia: Curtin University of Technology.
Gable, G., Sedera, D., & Chan, T. (2008). Re-conceptualizing information system success: the IS-impact measurement model. Journal of the Association for Information Systems, 9(7), 377–408.
Gathercoal, P., Love, D., Bryde, B., & McKean, G. (2002). On implementing web-based electronic portfolios. Educause Quarterly, 2, 29–37.
Gefen, D., & Straub, D. (2005). A practical guide to factorial validity using PLS-graph: tutorial and annotated example. Communications of the Association for Information Systems, 16, 91–109.
Gefen, D., Straub, D. W., & Boudreau, M. (2000). Structural equation modeling and regression: guidelines for research practice. Communications of the Association for Information Systems, 4, 1–78. http://www.cis.gsu.edu/~dstraub/Papers/Resume/Gefenetal2000.pdf (October 2010).
Haenlein, M., & Kaplan, A. M. (2004). A beginner's guide to partial least squares analysis. Understanding Statistics, 3(4), 283–297. www.stat.umn.edu/~sandy/courses/8801/articles/pls.pdf (September 2010).
Henseler, J., Ringle, C. M., & Sinkovics, R. R. (2010). The use of partial least squares path modeling in international marketing. Advances in International Marketing, 20, 277–319.
Hickerson, C., & Preston, M. (2006). Transition to ePortfolios: a case study of student attitudes. In A. Jafari, & C. Kaufman (Eds.), Handbook of research on ePortfolios (pp. 460–473). London, UK: IGI Global.
Jafari, A. (2004). The sticky e-portfolio system: tackling challenges and identifying attributes. Educause Review, 39(4), 38–49.
Karim, J. (2009). Emotional labor and psychological distress: testing the mediatory role of work-family conflict. European Journal of Social Sciences, 11(4), 584–598.
Katerattanakul, P., & Siau, K. (2008). Factors affecting the information quality of personal web portfolios. Journal of the American Society for Information Science and Technology, 59(1), 63–76.
Kim, M., Kim, J. H., & Lennon, S. (2006). Online service attributes available on apparel retail web sites: an E-S-QUAL approach. Managing Service Quality, 16(1), 51–77.
Laudon, K. C., & Laudon, J. P. (2002). Essentials of management information systems: Managing the digital firm (5th ed.). New Jersey: Prentice Hall.
Lawlor, S. C. (1994). Computer Information Systems (3rd ed.). Fort Worth, USA: The Dryden Press.
Lawshe, C. H. (1975). A quantitative approach to content validity. Personnel Psychology, 28, 563–575.
Lewis, B. R., Templeton, G. F., & Byrd, T. A. (2005). A methodology for construct development in MIS research. European Journal of Information Systems, 14(4), 388–400.
Lin, H.-F. (2007). Measuring online learning systems success: applying the updated DeLone and McLean model. CyberPsychology & Behaviour, 10(6), 817–820.
Loehlin, J. C. (2004). Latent variable models: An introduction to factor, path, and structural equation analysis (4th ed.). Mahwah, NJ: Lawrence Erlbaum Associates.
López Fernández, O., & Rodríguez Illera, J. L. (2009). Investigating university students' adaptation to a digital learner course portfolio. Computers & Education, 52(3), 608–616.
Love, D., McKean, G., & Gathercoal, P. (2004). Portfolios to webfolios and beyond: levels of maturation. Educause Quarterly, 2, 24–37.
Mahmood, M. A., Bagci, K., & Ford, T. C. (2004). Online shopping behavior: cross-country empirical research. International Journal of Electronic Commerce, 9(1), 9–30.
Marcoul-Burlinson, I. (2006). ePortfolio: constructing learning. In A. Jafari, & C. Kaufman (Eds.), Handbook of research on ePortfolios (pp. 168–179). London, UK: IGI Global.
Markus, L., & Tanis, C. (2000). The enterprise system experience: from adoption to success. In R. W. Zmud (Ed.), Framing the domains of IT management: Projecting the future through the past. Cincinnati: Pinnaflex Educational Resources Inc.
Moore, G. C., & Benbasat, I. (1991). Development of an instrument to measure the perceptions of adopting an information technology innovation. Information Systems Research, 2(3), 192–222.
Mu, E., Wormer, S., Foizey, R., Barkon, B., & Vehec, M. (2010). Conceptualizing the functional requirements for a next generation e-Portfolio system. Educause Quarterly, 33(1). Retrieved from http://www.educause.edu/eq/archives.
O'Brien, K. (2006). ePortfolios as learning construction zones: provost's perspective. In A. Jafari, & C. Kaufman (Eds.), Handbook of research on ePortfolios (pp. 74–82). London, UK: IGI Global.
Parasuraman, A., Berry, L., & Zeithaml, V. (1988). SERVQUAL: a multiple-item scale for measuring consumer perceptions of service quality. Journal of Retailing, 64(1), 12–40.
Paulson, F. L., Paulson, P. R., & Meyer, C. (1991). What makes a portfolio a portfolio? Educational Leadership, 48(5), 60–63.
Petter, S., DeLone, W., & McLean, E. (2008). Measuring information system success: models, dimensions, measures, and interrelationships. European Journal of Information Systems, 17, 236–263.
Richardson, H. C., & Ward, R. (2005). Getting what you want: Implementing personal development planning through e-portfolio. CRA. Retrieved from http://www.jisc.ac.uk/uploaded_documents/Guidance_final.doc (January 2010).
Ring, G., & Foti, S. (2006). Using ePortfolios to facilitate professional development among pre-service teachers. In A. Jafari, & C. Kaufman (Eds.), Handbook of research on ePortfolios (pp. 340–355). London, UK: IGI Global.
Rivard, S., Poirier, G., Rayond, L., & Bergeron, F. (1997). Development of a measure to assess the quality of user-developed applications. The Data Base for Advances in Information Systems, 28(3), 44–58.
Roldán, J. L., & Leal, A. (2003). A validation test of an adaptation of the DeLone and McLean's model in the Spanish EIS field. In J. J. Cano (Ed.), Critical reflections on information systems: A systemic approach (pp. 66–84). Hershey, PA: IGI Publishing. http://citeseerx.ist.psu.edu (September 2009).
Sabherwal, R., Jeyaraj, A., & Chowa, C. (2006). Information system success: individual and organizational determinants. Management Science, 52(12), 1849–1864.
Seddon, P. B., Staples, S., Patnayakuni, R., & Bowtell, M. (1999). Dimensions of information systems success. Communications of the Association for Information Systems, 2(20), 2–61.
Segars, A. H. (1997). Assessing the unidimensionality of measurement: a paradigm and illustration within the context of information systems research. Omega, 25(1), 107–121. http://infosys.coba.usf.edu/rm/Segars97-ScaleUnidimensionality.pdf (September 2010).
Segars, A. H., & Grover, V. (1998). Strategic information systems planning success: an investigation of the construct and its measurement. MIS Quarterly, 22(2), 139–163.
Shannon, C. E., & Weaver, W. (1949). The mathematical theory of communication. Urbana, IL: University of Illinois Press.
Stefani, L., Mason, R., & Pegler, C. (2007). The educational potential of e-Portfolios. Great Britain: Routledge, T&F Group.
Stevenson, H. J. (2006). Using ePortfolios to foster peer assessment, critical thinking, and collaboration. In A. Jafari, & C. Kaufman (Eds.), Handbook of research on ePortfolios (pp. 112–123). London, UK: IGI Global.
Straub, D., Boudreau, M.-C., & Gefen, D. (2004). Validation guidelines for IS positivist research. Communications of the Association for Information Systems, 13, 380–427.
Tenenhaus, M., Esposito Vinzi, V., Chatelin, Y.-M., & Lauro, C. (2005). A global goodness-of-fit index for PLS structural equation modeling. Computational Statistics & Data Analysis, 48(1), 159–205.
Torkzadeh, G., & Doll, W. J. (1999). The development of a tool for measuring the perceived impact of information technology on work. Omega: The International Journal of Management Science, 27(3), 327–339.
Tzeng, J.-Y. (2011). Perceived values and prospective users' acceptance of prospective technology: the case of a career ePortfolio system. Computers & Education, 56, 157–165.
Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: toward a unified view. MIS Quarterly, 27(3), 425–478.
Wang, W., & Wang, C. (2009). An empirical study of instructor adoption of web-based learning systems. Computers & Education, 53, 761–774.
Wixom, B. H., & Watson, H. T. (2001). An empirical investigation of the factors affecting data warehousing success. MIS Quarterly, 25(1), 17–41. http://www.jstor.org/stable/3250957 (March 2010).