
EXPLORATORY AND CONFIRMATORY FACTOR ANALYSIS

Understanding Concepts and Applications

BRUCE THOMPSON

American Psychological Association • Washington, DC


Copyright © 2004 by the American Psychological Association. All rights reserved. Except as permitted under the United States Copyright Act of 1976, no part of this publication may be reproduced or distributed in any form or by any means, or stored in a database or retrieval system, without the prior written permission of the publisher.

Published by
American Psychological Association
750 First Street, NE
Washington, DC 20002
www.apa.org

To order:
APA Order Department
P.O. Box 92984
Washington, DC 20090-2984
Tel: (800) 374-2721
Direct: (202) 336-5510
Fax: (202) 336-5502
TDD/TTY: (202) 336-6123
Online: www.apa.org/books/
E-mail: [email protected]

In the U.K., Europe, Africa, and the Middle East, copies may be ordered from:
American Psychological Association
3 Henrietta Street
Covent Garden, London
WC2E 8LU England

Typeset in Goudy by World Composition Services, Inc., Sterling, VA
Printer: Data Reproductions, Auburn Hills, MI
Cover Designer: Veronica Brcza, Sterling, VA
Technical/Production Editor: Dan Brachtesende

The opinions and statements published are the responsibility of the author, and such opinions and statements do not necessarily represent the policies of the American Psychological Association.

Library of Congress Cataloging-in-Publication Data

Thompson, Bruce, 1951-
  Exploratory and confirmatory factor analysis : understanding concepts and applications / Bruce Thompson.—1st ed.
    p. cm.
  Includes bibliographical references.
  ISBN 1-59147-093-5
  1. Factor analysis—Textbooks. I. Title.
  BF39.2.F32T48 2004
  150'.1'5195354—dc22

British Library Cataloguing-in-Publication Data
A CIP record is available from the British Library.

Printed in the United States of America
First Edition


CONTENTS

Preface   ix

Chapter 1. Introduction to Factor Analysis   3
     Purposes of Factor Analysis   4
     Two Major Classes of Factor Analysis   5
     Organization of the Book   7

Chapter 2. Foundational Concepts   9
     Heuristic Example   9
     Two Worlds of Statistics   11
     Some Basic Matrix Algebra Concepts   12
     More Heuristic Results   15
     Sample Size Considerations in EFA   24
     Major Concepts   25

Chapter 3. Exploratory Factor Analysis Decision Sequence   27
     Matrix of Association Coefficients   28
     Number of Factors   31
     Factor Extraction Method   36
     Factor Rotation   38
     Factor Scores   44
     Major Concepts   47

Chapter 4. Components Versus Factors Controversies   49
     Measurement Error Versus Sampling Error   50
     Principal Components Versus Principal Axes Factors   53
     Major Concepts   55

Chapter 5. Comparisons of Some Extraction, Rotation, and Score Methods   57
     Factor Scores in Different Extraction Methods   57
     Structure Coefficients in a Components Context   61
     Communality Coefficients as R² Values   61
     Major Concepts   64

Chapter 6. Oblique Rotations and Higher-Order Factors   67
     Maximizing Sample Fit Versus Replicability   69
     Higher-Order Factors   72
     Major Concepts   81

Chapter 7. Six Two-Mode Techniques   83
     Cattell's "Data Box"   83
     Q-Technique   86
     Major Concepts   91

Chapter 8. Confirmatory Rotation and Factor Interpretation Issues   93
     "Best-Fit" Rotation Example   94
     Factor Interpretation Considerations   96
     Major Concepts   98

Chapter 9. Internal Replicability Analyses   99
     EFA Cross-Validation   101
     EFA Bootstrap   102
     Major Concepts   106

Chapter 10. Confirmatory Factor Analysis Decision Sequence   109
     CFA Versus EFA   110
     Preanalysis Decisions   114
     Postanalysis Decisions   127
     Major Concepts   131

Chapter 11. Some Confirmatory Factor Analysis Interpretation Principles   133
     Model Identification Choices   134
     Different Estimation Theories   138
     CFA Pattern and Structure Coefficients   138
     Comparisons of Model Fits   142
     Higher-Order Models   145
     Major Concepts   147

Chapter 12. Testing Model Invariance   153
     Same Model, Different Estimates   153
     Same Model, Equal Estimates   155
     Same Model, Test Invariance of Parameters   159
     Major Concepts   159

Appendix A: LibQUAL+™ Data   163

Appendix B: SPSS Syntax to Execute a "Best-Fit" Factor Rotation   169

Notation   177

Glossary   179

References   183

Index   191

About the Author   195


PREFACE

Investigation of the structure underlying variables (or people, or time) has intrigued social scientists since the early origins of psychology. Conducting one's first factor analysis can yield a sense of awe regarding the power of these methods to inform judgment regarding the dimensions underlying constructs.

This book presents the important concepts required for implementing two disciplines of factor analysis: exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). The book may be unique in its effort to present both analyses within the single rubric of the general linear model. Throughout the book canons of best factor analytic practice are presented and explained.

The book has been written to strike a happy medium between accuracy and completeness versus overwhelming technical complexity. An actual data set, randomly drawn from a large-scale international study (cf. Cook & Thompson, 2001) involving faculty and graduate student perceptions of academic libraries, is presented in Appendix A. Throughout the book different combinations of these variables and participants are used to illustrate EFA and CFA applications.

Real understanding of statistical concepts comes with applying these methods in one's own research. The next-best learning opportunity involves analyzing real data provided by others. You are encouraged to replicate the analyses reported in the book and to conduct methodological variations on the reported analyses. You also may find it useful to conduct complementary analyses to those reported here (e.g., if 11 variables from 100 faculty are analyzed, conduct the same analysis using data from the 100 graduate students).

There are also other data sets that have a long history of use in heuristics in the factor analytic context. For example, the data reported by Holzinger and Swineford (1939) are widely reported. Readers are encouraged to access and analyze these and other data sets.

To improve the instructional effectiveness of the book, some liberties have been taken in using American Psychological Association (APA) style. For example, to distinguish between "score world" and "area world" statistics, area world statistics are often presented with percentage signs, even across the rows of a single table, for emphasis and clarity. And more than the usual number of decimal places may be reported, to help the reader compare reports in the book with the related reports in computer outputs, or to distinguish one block of results from another by using different numbers of decimal places for different results.

I appreciate in this regard the latitude afforded me by the APA production staff. I also appreciate the opportunity to write the book in my own voice. This book has been written as if I were speaking directly to you, the student, in my own office or classroom. Fortunately, I have a very big office.

I also appreciate the thoughtful suggestions of Randy Arnau, University of Southern Mississippi, Susan Cromwell, Texas A&M University, and Xitao Fan, University of Virginia, on a draft of the book. Notwithstanding their suggestions, of course I remain responsible for the book in its final form.


EXPLORATORY AND CONFIRMATORY FACTOR ANALYSIS


1

INTRODUCTION TO FACTOR ANALYSIS

The early 1900s were exciting times for social scientists. Psychology was emerging as a formal discipline. In France, Alfred Binet created a measure of intellectual performance that was the forerunner to modern IQ tests. But from the very outset there were heated controversies regarding definitions of intelligence and related questions about how to measure intelligence.

Some psychologists took the view that intelligence was a single general ability. Those taking this view presumed that individuals who were very bright in one area of performance tended to perform well across the full range of intellectual tasks. This came to be called the general, or "G," theory of intelligence. Other psychologists argued that individuals may be proficient at one task, but that expertise in one area has little or no predictive ability as regards other areas of cognitive performance. These latter psychologists argued that intelligence involved specific factors that correlated minimally with each other.

However, in these early years psychologists who wished to test their divergent views had few analytic tools with which to do so. To address this void, Spearman (1904) conceptualized methods that today we term factor analysis. In the intervening decades, numerous statistical developments mean that today the analyst confronts an elaborate array of choices when using these methods.


Of course, for many decades factor analysis was infrequently used. In the era of statistical analysis via hand-cranked calculators, the computations of factor analysis were usually overwhelming. In part this was also because responsible researchers conducting analyses prior to the advent of modern computers felt obligated to calculate their results repeatedly, until at least two analyses of the data yielded the same results.

Happily, today the very, very complicated calculations of modern factor analysis can be accomplished on a microcomputer in only moments, and in a user-friendly point-and-click environment. As a consequence, today factor analytic reports comprise as many as 18% to 27% of the articles in some journals (Fabrigar, Wegener, MacCallum, & Strahan, 1999; Russell, 2002).

PURPOSES OF FACTOR ANALYSIS

Factor analytic methods can be used for various purposes, three of which are noted here. First, when a measure and its associated scoring keys have been developed, factor analysis can be used to inform evaluations of score validity.

Validity questions focus on whether scores measure "the correct something." Measures producing scores that are completely random measure "nothing" (Thompson, 2003). Such tools yield scores that are perfectly unreliable. We never develop tests to measure nothing, because we have readily available tools, such as dice and roulette wheels, that we can use to obtain random scores. Obviously, for scores intended to measure something (e.g., intelligence, self-concept), the scores must be reliable. But reliability is only a necessary—not a sufficient—condition for validity.

Validity questions involve issues such as, "Does the tool produce scores that seem to measure the intended dimensions?" and "Are items intended only to measure a given dimension actually measuring and only measuring that dimension?" These are exactly the kinds of questions that factor analysis was created to confront. So it is not surprising that historically "construct validity has [even] been spoken of as ... 'factorial validity'" (Nunnally, 1978, p. 111).

Joy Guilford's discussion some 50 years ago illustrates this application:

    Validity, in my opinion, is of two kinds. ... The factorial validity [italics added] of a test is given by its ... [factor weights on] meaningful, common, reference factors. This is the kind of validity that is really meant when the question is asked "Does this test measure what it is supposed to measure?" A more pertinent question should be "What does this test measure?" The answer then should be in terms of factors. ... I predict a time when any test author will be expected to present information regarding the factor composition of his [sic] tests. (Guilford, 1946, pp. 428, 437-438)

Although modern validity language does not recognize a type of validity called "factorial validity," factor analysis nevertheless is helpful in addressing construct validity questions.

Thus, Gorsuch (1983) has noted that "a prime use of factor analysis has been in the development of both the operational constructs for an area and the operational representatives for the theoretical constructs" (p. 350). And Nunnally (1978) suggested that "factor analysis is intimately involved with questions of validity. ... Factor analysis is at the heart of the measurement of psychological constructs" (pp. 112-113).

Second, factor analysis can be used to develop theory regarding the nature of constructs. At least historically, empirically driven theory development was sometimes held in high regard. In some of these applications, numerous different measures are administered to various samples, and the results of factor analysis then are used to specify construct dimensions. This application, unlike the first one, is inductive.

A classic example is the structure of intellect model developed by Guilford (1967). Guilford administered dozens of tests of cognitive abilities over years to derive his theory, in which he posited that intelligence consists of more than 100 different abilities that are independent of each other.

Third, factor analysis can be used to summarize relationships in the form of a more parsimonious set of factor scores that can then be used in subsequent analyses (e.g., analysis of variance, regression, or descriptive discriminant analysis). In this application, unlike the first two, the factor analysis is only an intermediate step in inquiry, and not the final analysis. Using fewer variables in substantive analysis tends to conserve degrees of freedom and improve power against Type II error. Furthermore, factor scores can be computed in such a manner that the unreliable variance in the original variables tends to be discarded once the original scores are reexpressed in a smaller set of factor scores.

TWO MAJOR CLASSES OF FACTOR ANALYSIS

There are actually two discrete classes of factor analysis: exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). Analyses such as those originally proposed by Spearman (1904) have now come to be called exploratory factor analysis. In EFA, the researcher may not have any specific expectations regarding the number or the nature of underlying constructs or factors. Even if the researcher has such expectations (as in a validity investigation), EFA does not require the researcher to declare these expectations, and the analysis is not influenced by these expectations.

Confirmatory factor analysis methods were developed much more recently (cf. Jöreskog, 1969). These analyses require that the researcher must have specific expectations regarding (a) the number of factors, (b) which variables reflect given factors, and (c) whether the factors are correlated. CFA explicitly and directly tests the fit of factor models.

Researchers without theories cannot use CFA, but researchers with theories usually find CFA more useful than EFA. Confirmatory factor analysis is more useful in the presence of theory because (a) the theory is directly tested by the analysis and (b) the degree of model fit can be quantified in various ways. Of course, some researchers begin an inquiry with specific theoretical expectations, but after invoking CFA find that their expectations were wildly wrong, and may then revert to EFA procedures.

Although factor analysis has been characterized as "one of the most powerful methods yet for reducing variable complexity to greater simplicity" (Kerlinger, 1979, p. 180) and as "the furthest logical development and reigning queen of the correlational methods" (Cattell, 1978, p. 4), the methods have been harshly criticized by others (cf. Armstrong, 1967). For example, with respect to research in the area of communication, Cronkhite and Liska (1980) noted:

    Apparently, it is so easy to find semantic scales which seem relevant to [information] sources, so easy to name or describe potential/hypothetical sources, so easy to capture college students to use the scales to rate the sources, so easy to submit those ratings to factor analysis, so much fun to name the factors when one's research assistant returns with the computer printout, and so rewarding to have a guaranteed publication with no fear of nonsignificant results that researchers, once exposed to the pleasures of the factor analytic approach, rapidly become addicted to it. (p. 102)

However, these criticisms apply more to EFA than to CFA, because the only danger in EFA would be that factors might not be interpretable, in which case the results may be regarded as negative. But CFA presumes the invocation of specific expectations, with the attendant possibility that the predetermined theory in fact does not fit, in which case the researcher may be unable to explain the relationships among the variables being analyzed. Both EFA and CFA remain useful today, and our selection between the two classes of factor analysis generally depends on whether we have specific theory regarding data structure.

But even researchers who only conduct research in the presence of such theory must understand EFA. Mastery of CFA requires understanding of EFA, which was the historical precursor to CFA. Indeed, a fundamental purpose of this book is to teach the reader what is common and what is different across these two classes of factor analysis.

Understanding EFA also may facilitate deeper insight into other statistical analyses. An important idea in statistics is the notion of the general linear model (GLM; cf. Bagozzi, Fornell, & Larcker, 1981; Cohen, 1968; Knapp, 1978). Central to the GLM are realizations that all statistical analyses are correlational and can yield effect sizes analogous to r and apply weights to measured variables to obtain scores on the composite variables that are actually the focus of all these analyses (see Thompson, 1991). In fact, because all parametric methods (e.g., analysis of variance, regression, multivariate analysis of variance) are special cases of canonical correlation analysis, and one kind of EFA is always implicitly invoked in canonical analysis (Thompson, 1984, pp. 11-13), it can be shown that an EFA is an implicit part of all parametric statistical tests. And as Gorsuch (2003) recently pointed out, "no factor analysis is ever completely exploratory" (p. 143).

ORGANIZATION OF THE BOOK

The book is organized to cover EFA first, followed by CFA. The concepts underlying EFA generally have counterparts in CFA. In fact, a major focus of the book is communicating the linkages between EFA and CFA. In general, if a statistical issue is important in EFA, the same issue is important in CFA. For example, just as EFA pattern and structure coefficients are both important in interpretation when factors are correlated, the same sets of coefficients are important when factors are correlated in CFA.

Chapter 2 covers some foundational statistical topics, such as score world versus area world statistics, and basic matrix algebra concepts. Chapters 3 through 9 elaborate EFA decisions and analytic choices. Finally, chapters 10 through 12 cover CFA methods.

Each of the chapters concludes with a section called "Major Concepts." These sections are not intended to summarize all the key points of the chapters. Instead, these sections emphasize major concerns that are insufficiently recognized in contemporary practice, or that otherwise might get lost in the details of chapter coverage.


2

FOUNDATIONAL CONCEPTS

A simple heuristic example of exploratory factor analysis (EFA) may make the discussion of some foundational concepts more concrete. Along the way a few elementary ideas involving a type of mathematics called matrix algebra will be introduced. It is sometimes jokingly said that matrix algebra invariably induces psychosis. It is true that matrix algebra in some respects is very different from the simple algebra taught in high school. And knowing the mechanics of matrix algebra thankfully is not necessary for understanding factor analysis. But we do need to know that some basic concepts of matrix algebra (such as when matrices can be multiplied and how division is performed with matrices) are important, even though the mechanics of the process are unimportant.

HEURISTIC EXAMPLE

Table 2.1 presents hypothetical data involving seven students' ratings of me regarding how strongly they agree (9 = most agree) versus disagree (1 = most disagree) that I am Handsome, Beautiful, Ugly, Brilliant, Smart, and Dumb. Because these variables are directly measured, without applying any weights (e.g., regression β weights) to the scores, these variables are called synonymously measured or observed variables.

As we shall soon see, scores that are obtained by applying weights to the observed scores are synonymously called composite, latent, or synthetic variables. All analyses within the general linear model (GLM) yield scores for every participant on at least one composite variable, and often on several composite variables.


TABLE 2.1
Heuristic EFA Data

                          Measured variable
Student/
Statistic   Handsome  Beautiful   Ugly  Brilliant  Smart   Dumb
Barbara        6          5         4       8        6       2
Deborah        8          7         2       7        5       3
Jan            9          8         1       9        7       1
Kelly          5          4         5       9        7       1
Murray         4          3         6       9        7       1
Susan          7          6         3       7        5       3
Wendy          3          2         7       7        5       3
Mean         6.00       5.00      4.00    8.00     6.00    2.00
SD           2.16       2.16      2.16    1.00     1.00    1.00

When we conduct a factor analysis, we are exploring the relationships among measured variables and trying to determine whether these relationships can be summarized in a smaller number of latent constructs. Several different statistics can be used to summarize the relationships among the variables (e.g., Pearson correlation coefficients, Spearman's rho coefficients). Here we use Pearson product-moment correlation coefficients. Table 2.2 presents the matrix of Pearson correlation coefficients for all the pairwise combinations of these six variables.

The Table 2.1 data set involves relatively few participants (n = 7) and relatively few variables (v = 6). These features make the data set easy to understand. However, the data and the results are unrealistic. Real EFA problems involve more participants and more variables. Factor analysis is usually less relevant when we have so few variables, because we gain little by summarizing relationships among so few measured variables in a smaller set of latent variables.

TABLE 2.2
Bivariate Correlation Matrix

Variable    Handsome  Beautiful   Ugly  Brilliant  Smart   Dumb
Handsome       1.0       1.0     -1.0       .0       .0      .0
Beautiful      1.0       1.0     -1.0       .0       .0      .0
Ugly          -1.0      -1.0      1.0       .0       .0      .0
Brilliant       .0        .0       .0      1.0      1.0    -1.0
Smart           .0        .0       .0      1.0      1.0    -1.0
Dumb            .0        .0       .0     -1.0     -1.0     1.0
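Readers who want to verify these correlations by computer can do so in a few lines. The following is a minimal sketch in Python with the numpy library (software not used in this book); the data matrix simply retypes Table 2.1, and the printed statistics match the table's means, standard deviations, and correlations.

import numpy as np

# Table 2.1 ratings: rows are the seven students; columns are
# Handsome, Beautiful, Ugly, Brilliant, Smart, and Dumb
X = np.array([[6, 5, 4, 8, 6, 2],
              [8, 7, 2, 7, 5, 3],
              [9, 8, 1, 9, 7, 1],
              [5, 4, 5, 9, 7, 1],
              [4, 3, 6, 9, 7, 1],
              [7, 6, 3, 7, 5, 3],
              [3, 2, 7, 7, 5, 3]], dtype=float)

print(X.mean(axis=0))              # 6.00  5.00  4.00  8.00  6.00  2.00
print(X.std(axis=0, ddof=1))       # 2.16  2.16  2.16  1.00  1.00  1.00
R = np.corrcoef(X, rowvar=False)   # Pearson r for every pair of columns
print(np.round(R, 1))              # reproduces the Table 2.2 matrix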


Furthermore, we would rarely confront a correlation matrix, such as that in Table 2.2, in which the correlation coefficients are only -1.0, +1.0, or 0.0. Nor would we tend to see relationships for which the latent constructs are so obvious from a simple examination of the correlation matrix even without performing any statistical analyses.

An examination of the Table 2.2 correlations suggests that two factors (i.e., latent variables) underlie the covariations in the Table 2.1 data. Three patterns are obvious for the heuristic example.

First, one factor involves perceptions of my physical attractiveness. Three variables have r values that are 1.0 (or 100%): Handsome, Beautiful, and Ugly, although scores on Ugly are scaled in the opposite direction from scores on Handsome and Beautiful. Second, another factor involves perceptions of my intellectual prowess. Three variables have r values that are 1.0 (or 100%): Brilliant, Smart, and Dumb, although scores on Dumb are scaled in the opposite direction from scores on Brilliant and Smart. Third, the fact that all pairwise combinations of r values are 0.0 (or 0%) for combinations involving one measured variable from one set (e.g., Handsome) and one measured variable from the other set (e.g., Brilliant) suggests that the two latent variables also are uncorrelated with each other.

TWO WORLDS OF STATISTICS

All statistics can be conceptualized as existing in or describing dynamics of one of two "worlds": the score world or the area world. All statistics in the score world are in an unsquared metric, as are the scores themselves. For example, if we have data on how many dollars our classmates have on their persons, each measured variable score is in the metric, dollars. Statistics in the score world have implicit superscripts of 1. Example statistics in this world are the mean, the median, the standard deviation (SD), and the correlation coefficient (r).

If the mean is 7, the mean is 7 dollars. If the SD is 3, the SD is 3 dollars. The weights (e.g., regression β [beta] weights, regression b weights) applied to measured variables to obtain scores on composite variables are also in the score world (or they couldn't be applied to the measured variables).

The statistics in the area world are in a squared metric. For example, if the scores are measured in dollars, and the variance is 9, the variance (SD²) is 9 squared dollars (not 9 dollars). Many of the statistics in this world have explicit superscripts of 2 (e.g., r², R²) or names that communicate that these statistics are squared values (e.g., the sum of squares, the mean square). Other statistics in this world do not have either superscripts of 2 or names clearly indicating that they are in the area world (e.g., the variance, the covariance).


TABLE 2.3
Unsquared and Squared Factor Weights

                        Factor           Squared values
Measured variable      I       II          I        II
Handsome              1.0      .0        100         0
Beautiful             1.0      .0        100         0
Ugly                 -1.0      .0        100         0
Brilliant              .0     1.0          0       100
Smart                  .0     1.0          0       100
Dumb                   .0    -1.0          0       100

Any statistics we compute as ratios of one area world statistic to another area world statistic are also themselves in the area world. For example, both R² and the analysis of variance (ANOVA) effect size η² (eta-squared) are computed as:

   sum of squares(EXPLAINED) / sum of squares(TOTAL)

Reliability coefficients (e.g., Cronbach's α) are also in the area world, because they are computed as ratios of reliable variance or reliable sum of squares divided by total score variance or the total sum of squares. Reliability statistics are in the area world, even though they can be negative (e.g., -.3), and indeed even though they can be less than -1.0 (e.g., -7.0; Thompson, 2003).
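As a small illustration of such a ratio (the example is mine, not from the book, and uses Python with numpy rather than the book's software), the R² for a simple regression can be computed directly as one area world statistic divided by another:

import numpy as np

x = np.array([1., 2., 3., 4., 5.])
y = np.array([2., 2., 4., 5., 7.])                 # illustrative scores
b, a = np.polyfit(x, y, 1)                         # slope and intercept of a simple regression
y_hat = a + b * x                                  # predicted scores
ss_total     = np.sum((y - y.mean()) ** 2)         # area world quantity
ss_explained = np.sum((y_hat - y.mean()) ** 2)     # area world quantity
print(ss_explained / ss_total)                     # R-squared, itself in the area world
print(np.corrcoef(x, y)[0, 1] ** 2)                # same value obtained by squaring r

Both print statements yield the same number, which is the point: squaring the score world statistic r lands you in the same area world as the ratio of the two sums of squares.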

Table 2.3 presents the unsquared factor weights (analogous to regression β weights) for the Table 2.2 correlation matrix (and the Table 2.1 data used to create the correlation matrix). Table 2.3 also presents the squared values associated with this analysis. To help readers keep straight which statistics are in the area versus the score world, in many cases I present these statistics as percentages (e.g., r = 1.0 = 100%).

The Table 2.3 results from the factor analysis of the Table 2.2 correlation matrix confirm our expectations. The Table 2.3 matrix (and the Table 2.1 data and the Table 2.2 correlation matrix) suggests that two uncorrelated factors (i.e., my perceived physical attractiveness and my perceived intellectual abilities) underlie the six measured variables.

SOME BASIC MATRIX ALGEBRA CONCEPTS

As noted previously, it is not important to know how to do matrix algebra (which is fortunate, because as noted previously this knowledge can cause severe psychological distress). Readers who desire an understanding of the mechanics of matrix algebra can consult any number of books (e.g., Cooley & Lohnes, 1971, pp. 15-20; Tatsuoka, 1971, chap. 2). But it is important to know some matrix algebra concepts, so that one can have some understanding of what is being done in certain analyses.

Different books use different notations to indicate that a matrix is being discussed. In this book, capital letters in bold are used to represent a matrix (e.g., X, R). There are six matrix algebra concepts that must be mastered. These are discussed below.

Rank

In this book we primarily consider matrices that are two-dimensional (i.e., number of rows > 1, and number of columns > 1). Matrix rank characterizes how many rows and columns a particular matrix has. When we specify rank, we always report the number of rows first, and the number of columns second, using subscripts. For example, the measured variable scores reported in Table 2.1 could be represented as X(7x6). And the bivariate correlation matrix presented in Table 2.2 is R(6x6).

Transpose

For some purposes we can reorganize the data in a matrix such that the columns in the original matrix become the rows in a second matrix, and the rows in the original matrix become the columns in the second matrix. Such a new matrix is called a transpose. We designate a transpose by putting a single quotation or tick mark on the matrix. For example, if we have a matrix, Y(3x2), the transpose of this matrix is Y'(2x3).

Consider the following matrix, Y(3x2):

   1  4
   2  5
   3  6

The transpose of this matrix, Y'(2x3), is:

   1  2  3
   4  5  6
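For readers who like to check such bookkeeping by computer, a minimal sketch in Python with numpy (not the book's software) follows; note that numpy reports what the book calls rank as an array's "shape."

import numpy as np

Y = np.array([[1, 4],
              [2, 5],
              [3, 6]])    # rank 3 x 2 in the book's sense: 3 rows, 2 columns
print(Y.shape)            # (3, 2)
print(Y.T)                # the transpose Y'
print(Y.T.shape)          # (2, 3)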

Conformability

In regular (nonmatrix) algebra, any two numbers can be multiplied times each other (e.g., 4 x 9 = 36; 2 x 8 = 16). In matrix algebra, however, matrices can be multiplied times each other only if a certain condition is met: The number of columns in the left of two matrices being multiplied must equal the number of rows in the right matrix being multiplied. Matrices meeting this condition (and eligible for multiplication) are said to be conformable. For example, the Table 2.1 matrix, X(7x6), can be postmultiplied by the Table 2.2 correlation matrix, R(6x6), because 6 (the number of columns in X) equals 6 (the number of rows in R).

When we multiply matrices, the rank of the result or product matrix will equal the number of rows in the leftmost matrix, and the number of columns in the product matrix will equal the number of columns in the rightmost matrix. For example, the matrices Z(7x6) and W(6x2) are conformable, and the product of these two matrices would have a rank of 7 by 2.
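The same conformability rule can be seen in code. In the following sketch (Python/numpy, with arbitrary illustrative matrices that are not from the book), multiplying a 7 x 6 matrix by a 6 x 2 matrix yields a 7 x 2 product, whereas the reversed order is not conformable.

import numpy as np

Z = np.arange(42, dtype=float).reshape(7, 6)   # an illustrative 7 x 6 matrix
W = np.arange(12, dtype=float).reshape(6, 2)   # an illustrative 6 x 2 matrix

product = Z @ W        # conformable: 6 columns in Z match the 6 rows in W
print(product.shape)   # (7, 2): rows of the left matrix by columns of the right

# Reversing the order is not conformable (2 columns in W versus 7 rows in Z);
# W @ Z raises a ValueError about mismatched shapes.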

Symmetric Matrix

Some "square" matrices (number of rows equals number of columns) are symmetric, meaning that every i,j element of the matrix equals the corresponding j,i element of the same matrix. For example, in the Table 2.2 correlation matrix, the 3,1 element in R equals -1.0, which equals the 1,3 element in R. Correlation matrices are always symmetric. A related property of symmetric matrices, which are always conformable to themselves, is that a symmetric matrix always equals its own transpose, by definition (i.e., R(6x6) = R'(6x6)).

Identity Matrix

In regular algebra, any number times 1 equals the original number that was multiplied times 1 (e.g., 9 x 1 = 9; 15 x 1 = 15). In matrix algebra there is a matrix, called the identity matrix (I), which when multiplied times any matrix, equals the initial matrix. For example, let's say we have Y(3x2):

   1  4
   2  5
   3  6

For this example, to be conformable the identity matrix must have two rows. The identity matrix is always symmetric. The identity matrix always has ones on the "diagonal" running from the upper left corner down to the lower right corner, and zeroes in all off-diagonal locations. So for this example I(2x2) would be:

   1  0
   0  1

For this example Y(3x2) I(2x2) (i.e., Y postmultiplied by I) yields a product matrix that contains exactly the same numbers that are in Y(3x2) itself. The identity matrix is a critical matrix algebra concept, because as we see next I is invoked in order to be able to do matrix division.
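A two-line check (again a Python/numpy sketch, not the book's software) makes the same point:

import numpy as np

Y = np.array([[1, 4],
              [2, 5],
              [3, 6]], dtype=float)
I = np.identity(2)     # ones on the diagonal, zeroes elsewhere
print(Y @ I)           # the product contains exactly the same numbers as Y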

Matrix Inverse

For symmetric matrices, one could state a matrix equation such as R(6x6) R(6x6)^-1 = I(6x6). The matrix R(6x6)^-1 (i.e., a matrix with a -1 superscript) is called an inverse. Not every symmetric matrix has an inverse. If a matrix eligible for inversion (e.g., a symmetric matrix) does not have an inverse for a given data set, the matrix is said to be "ill conditioned."

Some symmetric matrices, such as some correlation matrices, do not have an inverse, because the numbers in the matrix cannot yield an R^-1 such that R R^-1 = I. If this happens in factor analysis, this may be due to sample size being too small.

Some factor analyses require that an inverse be computed in order to compute the factor solution. When using factor analyses requiring matrix inversion, if a matrix is ill conditioned, the solution simply cannot be computed using the particular technique requiring the inversion. Another technique must be selected, or sample size must be increased.

In regular algebra, division is straightforward. For example, to compute the F(CALCULATED) in ANOVA or in multiple regression, we compute

   F = [sum of squares(EXPLAINED) / df(EXPLAINED)] / [sum of squares(UNEXPLAINED) / df(UNEXPLAINED)].

But in matrix algebra the way that we do division is by postmultiplying the numerator matrix times the inverse of the matrix with which we wish to divide. For example, the division of B(2x2) by W(2x2) is accomplished using the formula B(2x2) W(2x2)^-1, which yields a new 2 x 2 product matrix. (This is actually one basis for obtaining F(CALCULATED) for a multivariate analysis of variance [MANOVA] and is directly analogous to the univariate ANOVA computation, except for the use of matrix algebra.)

Division is used a lot in univariate statistics. Division is also used a lot in multivariate statistics, including factor analysis. Clearly, access to an inverse is usually critical for most factor analytic applications.
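A brief sketch (Python/numpy, with a small illustrative correlation matrix that is not from the book) shows both the inverse and matrix "division" by postmultiplication. Incidentally, the idealized Table 2.2 correlation matrix, with its perfect plus and minus 1.0 correlations, is itself singular and therefore has no inverse.

import numpy as np

R = np.array([[1.0, 0.5],
              [0.5, 1.0]])         # a small, invertible correlation matrix
R_inv = np.linalg.inv(R)           # the inverse, R^-1
print(np.round(R @ R_inv, 12))     # R postmultiplied by its inverse yields I

B = np.array([[4.0, 2.0],
              [2.0, 3.0]])         # an illustrative symmetric matrix
print(B @ R_inv)                   # "dividing" B by R: postmultiply by the inverse

# An ill-conditioned (singular) matrix has no inverse; for example,
# np.linalg.inv(np.ones((2, 2))) raises numpy.linalg.LinAlgError.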

MORE HEURISTIC RESULTS

Throughout the GLM, weights are applied to the scores on the measured variables to obtain scores on composite variables. In the GLM, weights are invoked (a) to compute scores on the latent variables or (b) to interpret what the composite variables represent. Logically, because the weights are used to obtain the scores on the composite variable, the weights must be of some assistance in understanding the nature of the latent variables (i.e., help in interpreting what latent variables represent).

For example, in multiple regression these weights are often called β (beta) weights. A set of weights for the predictors in regression is called the regression equation. These β weights in regression are also sometimes called standardized regression coefficients because they are applied to the measured predictor variables in their z-score form (i.e., the measured variables have been transformed so that their means equal zero and their standard deviations and variances equal one). It is actually a misnomer to call the weights "standardized," because the weights are a constant for a given data set. Thus, the weight for the first predictor itself cannot be standardized.

If β1 = .5, this weight is applied to the first predictor z scores of everybody in the sample. We cannot apply the formula z = (Xi - X̄) / SD to a given weight. If everybody is assigned a weight of β1 = .5, the SD of β1 = 0, and then the division is mathematically impermissible. To be technically correct, we should call such weights "weights applied to the standardized measured variables" rather than "standardized weights."

Perhaps we are performing a regression using two predictor variables. Steve has z1 = .7 and z2 = .5. The form of the regression equation is

   Y = β1(z1) + β2(z2).

For a given data set, β1 might equal 2.0, and β2 might equal -1.2. Note that, as is the case throughout the GLM, weights are not necessarily correlation coefficients (see Courville & Thompson, 2001). Thus, as in this example, it is possible that none of the weights will lie in the range between -1 and +1. For Steve, Y(STEVE) = 2.0(0.7) + -1.2(0.5) = 1.4 + -0.6 = 0.8. In regression, these Y scores are the composite, latent, or synthetic variables.
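The same arithmetic can be written as a tiny script (Python, not the book's software); the weights and z scores are those of the Steve example.

beta = [2.0, -1.2]         # weights for the two predictors
z_steve = [0.7, 0.5]       # Steve's z scores on the two predictors

y_steve = sum(b * z for b, z in zip(beta, z_steve))
print(round(y_steve, 2))   # 0.8, Steve's score on the composite (latent) variable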

Pattern Coefficients

In factor analysis, pattern coefficients are the weights applied to the measured variables to obtain scores on the factor analysis latent variables (called factor scores). As noted, these weights are analogous to the β weights in multiple regression. These weights are also analogous to the standardized discriminant function coefficients in descriptive discriminant analysis (Huberty, 1994) and the standardized canonical function coefficients in canonical correlation analysis (Thompson, 1984, 2000a).

The factor pattern coefficients (P(VxF)) are computed partly to reexpress the variance represented in the correlation matrix that is being analyzed, and from which the factors (i.e., the sets of pattern coefficients) are extracted. The factors are extracted so that the first factor can reproduce the most variance in the matrix being analyzed, the second factor reproduces the second most variance, and so forth. The ability of one or more factors to reproduce the matrix being analyzed is quantified by the reproduced matrix of associations, such as the reproduced intervariable correlation matrix (R(VxV)+). This is the "how full is the glass" perspective.

The ability of the factors to reproduce the matrix being analyzed can also be quantified by computing the matrix that is left after a given number of factors have been extracted. This is called the residual matrix of associations, such as the residual intervariable correlation matrix (R(VxV)-). This is the converse, "how empty is the glass" perspective.

If the factor pattern coefficients perfectly reproduced the matrix of associations (here an intervariable Pearson correlation coefficient matrix), then R(VxV)- would consist entirely of zeroes, indicating that there is no information or variance left in this matrix. And if the factor pattern coefficients perfectly reproduced the matrix of associations, then the entries in R(VxV)+ would exactly match the entries in R(VxV).

The reproduced intervariable correlation matrix (R(VxV)+) is computed as

   R(VxV)+ = P(VxF) P'(FxV).

Note that the pattern coefficient matrix and its transpose are conformable, and the product matrix has the rank v by v (the same rank as the original intervariable correlation matrix).
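For the heuristic example this check is easy to run. The sketch below (Python/numpy, not the book's software) uses the Table 2.3 unsquared weights as the pattern matrix; because that idealized two-factor solution is perfect, the reproduced matrix equals Table 2.2 and the residual matrix is all zeroes.

import numpy as np

# Table 2.3 unsquared weights as the pattern matrix P (6 variables x 2 factors)
P = np.array([[ 1,  0],
              [ 1,  0],
              [-1,  0],
              [ 0,  1],
              [ 0,  1],
              [ 0, -1]], dtype=float)

# Table 2.2 correlation matrix
R = np.array([[ 1,  1, -1,  0,  0,  0],
              [ 1,  1, -1,  0,  0,  0],
              [-1, -1,  1,  0,  0,  0],
              [ 0,  0,  0,  1,  1, -1],
              [ 0,  0,  0,  1,  1, -1],
              [ 0,  0,  0, -1, -1,  1]], dtype=float)

R_plus  = P @ P.T      # reproduced correlation matrix ("how full is the glass")
R_minus = R - R_plus   # residual correlation matrix ("how empty is the glass")
print(R_plus)          # identical to Table 2.2 for this idealized example
print(R_minus)         # all zeroes: the two factors reproduce R perfectly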

Possible Numbers  of

 Factors

In

  regression there

 is

 only

 a

 single "equation" (set

 o f

 weights,

 [3

 weights)

in   a given analysis. In fac tor analy sis the sets of weights (i.e., patte rn

coefficients)  are  called /actors rather than  equations, mainly  just  to be incon-

sistent  and  to  confuse  th e  graduate students.  The  number  of  factors worth

keeping  can  range between  one and the  number  of variables.

Let's

  go

  back

  to our

  example

  of

 ratings

  by

  seven participants

  of me as

regards six measured variables. If every entry in the interv ariab le correlation

matrix  was  either  a +1 or a -1, the  r

2

  between every pair  of  measured

variables

  would

  be

  100%.  This  would imply that

  a

  single factor underlay

th e  ratings.  W e  would extract  one  factor (Pg

x

  |)  consisting exclusively of

negative

  or

  plus ones.

  A nd

  this single factor w ould perfectly reproduce

  th e

original

  R

6 x

6 mat r ix .

Technically,

  there

  would  be five  addit ional factors,  each  consisting

exclusively  of  zeroes, meaning  that  they  each  contain  and  reproduce  no

information.  But we  would  never  bother  with such factors. Indeed,  we

usually  do not bother even w ith factors  that  reproduce some, but only a

FOUNDAT10NAL  CONCEPTS  I 7

Page 23: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 23/185

l i t t le information,

  if we

  deem

  th e

  amount

  of

  informat ion

  or

  variabil i ty

reproduced  to not be  noteworthy.

At the other extreme, if somehow all the ratings were perfectly uncorrelated, then every off-diagonal element of R_6x6 would be a zero (and R_6x6 = I_6x6). Now, no two variables could be combined to create a factor (i.e., each measured variable would define its own factor). There would be six factors. Each factor would have one +1 value, and the remaining five entries would be zeroes.

At this second extreme (perfectly uncorrelated measured variables), R_VxV+ would exactly match the entries in R_VxV. In fact, whenever we extract all possible factors (i.e., the number of factors equals the number of measured variables), the pattern coefficients will perfectly reproduce the original matrix of associations being analyzed.

Factor Structure Coefficients

All GLM analyses are correlational and yield weights (e.g., factor pattern coefficients) that are applied to measured variables to obtain scores on composite variables. These weights are sometimes correlation coefficients, and sometimes they are not (cf. Courville & Thompson, 2001; Thompson, 1984, pp. 22-23).

Researchers can also compute bivariate correlation coefficients between measured variables and their composite variables (e.g., regression Ŷ scores, factor scores, discriminant function scores). These correlations across various GLM analyses are always called structure coefficients.

Throughout the GLM, consulting the structure coefficients is almost always essential to correctly interpreting results. For example, in regression Courville and Thompson (2001) documented the misinterpretations that can occur when only β weights are consulted in interpretation (also see Cooley & Lohnes, 1971, p. 55; Dunlap & Landis, 1998; Thompson & Borrello, 1985).

For descriptive discriminant analysis, Huberty (1994) noted that "construct definition and structure dimension [and not hit rates] constitute the focus [italics added] of a descriptive discriminant analysis" (p. 206). The importance of structure coefficients in canonical correlation analysis is widely recognized (cf. Cohen & Cohen, 1983, p. 456; Levine, 1977, p. 20). For example, Meredith (1964) noted, "If the variables within each set are moderately intercorrelated the possibility of interpreting the canonical variates by inspection of the appropriate regression weights [function coefficients] is practically nil" (p. 55).

Structure coefficients are equally important in factor analysis (cf. Graham, Guthrie, & Thompson, 2003; Thompson, 1997). Thus, Gorsuch (1983) emphasized that "a basic [italics added] matrix for interpreting the factors is the factor structure" (p. 207).


In the GLM, sometimes the weights are correlation coefficients. For example, in regression, if the predictor variables are perfectly uncorrelated, the β weight for a given predictor equals that variable's correlation with the outcome variable (Y).

In factor analysis, the structure coefficients (S_VxF) can be computed using the equation

P_VxF * R_FxF = S_VxF,

where R_FxF is the interfactor correlation matrix (not the v by v intervariable correlation matrix). When the factors are perfectly uncorrelated, by definition R_FxF = I_FxF, and therefore P_VxF = S_VxF.

In other words, whenever the factors are perfectly uncorrelated the pattern coefficients are also structure coefficients (and therefore must fall within the range -1 to +1), and in this case are more appropriately termed pattern/structure coefficients. And when factors are first extracted, the factors are always perfectly uncorrelated.
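The following numpy sketch illustrates this computation with a small hypothetical pattern matrix and interfactor correlation matrix; the numerical values are invented for illustration and are not from the book.

```python
import numpy as np

# Hypothetical pattern coefficients for four variables on two correlated factors.
P = np.array([[ .70,  .10],
              [ .65,  .05],
              [ .10,  .80],
              [ .05,  .75]])

# Hypothetical interfactor correlation matrix R_FxF (factors correlate .30).
R_ff = np.array([[1.0, 0.3],
                 [0.3, 1.0]])

S = P @ R_ff          # structure coefficients: S_VxF = P_VxF * R_FxF

# When factors are perfectly uncorrelated, R_FxF is an identity matrix and S = P.
S_orthogonal = P @ np.eye(2)
print(np.allclose(S_orthogonal, P))   # True
```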

"Loadings"

It is not uncommon in factor analytic reports to see authors referring to coefficients as "loadings." The problem is that this term is inherently ambiguous. In some reports authors refer to pattern coefficients as "loadings." In other reports authors refer to structure coefficients as "loadings." In some reports it is impossible to discern what coefficients the authors are describing. And in some reports the pattern and the structure coefficients are not equal but authors refer to both sets of coefficients as "loadings." These ambiguities cause confusion that impedes clear scientific dialogue. Therefore, some journals (see Thompson & Daniel, 1996) formally discourage the use of this vague colloquialism. For the same reasons, the term loading is not used here.

Communality Coefficients

For the Table 2.3 factors, because the factors are perfectly uncorrelated, the unsquared coefficients reproduced in Table 2.4 are pattern/structure coefficients. Recall that Pearson correlation coefficients (a) characterize the linear relationship between two intervally scaled variables but (b) the coefficients themselves are not intervally scaled. That is, an r of 1.0 is not twice as large as an r of 0.5.

To make the correlations intervally scaled, so that we can say how much larger one r is versus another r, we must first square the rs before we compare them. Thus, the r of 1.0 is four times larger than the r of 0.5,


TABLE 2.4
Pattern/Structure Coefficients, Communality Coefficients, and Eigenvalues

                                     Factor
Measured variable                  I       II       h²
Handsome                         1.0       .0     100%
Beautiful                        1.0       .0     100%
Ugly                            -1.0       .0     100%
Brilliant                         .0      1.0     100%
Smart                             .0      1.0     100%
Dumb                              .0     -1.0     100%
Sum of squared column values     3.0      3.0

Note. The eigenvalues (3.0 and 3.0) are in a squared, area-world metric, even though they are not presented as percentages.

because r² = 1.0² = 100% is four times larger than r² = 0.5² = 25%. Because the Table 2.4 structure coefficients are correlation coefficients, they too must be squared before they are compared.

One set of manipulations involves taking the squared structure coefficients for the uncorrelated factors (the formula is different for correlated factors) and adding the squared values along the rows across the columns (e.g., 100% + 0% = 100%). The resulting coefficients, called communality coefficients (h²), are area-world statistics. Because these factors are uncorrelated, the variance each measured variable shares with a factor is unique or nonoverlapping, and thus the h² value for a measured variable indicates how much of the variance in a measured variable the factors as a set can reproduce.
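A brief numpy sketch of this row-wise computation, using the Table 2.4 values for the uncorrelated heuristic factors, might look as follows (illustrative only).

```python
import numpy as np

# Pattern/structure coefficients for the uncorrelated heuristic factors (Table 2.4).
S = np.array([
    [ 1.0,  0.0],   # Handsome
    [ 1.0,  0.0],   # Beautiful
    [-1.0,  0.0],   # Ugly
    [ 0.0,  1.0],   # Brilliant
    [ 0.0,  1.0],   # Smart
    [ 0.0, -1.0],   # Dumb
])

# For uncorrelated factors, h^2 is the row sum of the squared structure coefficients.
h2 = (S ** 2).sum(axis=1)
print(h2)   # [1. 1. 1. 1. 1. 1.]  -> each variable is 100% reproduced by the factors
```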

If a measured variable had a communality coefficient close to 0%, this would mean that this variable is not being represented within the factors. If the researcher desires the variable to be represented in the factors, it may be necessary to extract additional factors.

An equally valuable perspective takes a converse view of the interpretation of the communality coefficients. The h² value for a measured variable reflects how much of the variance of a given measured variable was useful in delineating the factors as a set.

There is yet another perspective from which communality coefficients can be viewed. The communality coefficient for a variable is a "lower bound" estimate of the reliability of the scores on the variable. This means that if a variable has an h² of 50%, we believe that the reliability of the scores on the variable is no lower than .5 (and, of course, may be higher).

In some unusual cases communality coefficients greater than 100% occur. These are called Heywood cases (Heywood, 1931). Such solutions are statistically "inadmissible," and they indicate fundamental problems similar to estimating variances to be negative.


Table 2.4 presents the communality coefficients for each of the six measured variables in the heuristic example. Each of these is 100%. This reflects the fact that for the present example, if we reexpress the data presented in Table 2.1 in the form of scores on the two factors presented in Table 2.3, we lose no information whatsoever.

Eigenvalues

For the Table 2.3 factors, if we add the squared structure coefficients down the columns across the rows, we get a very important set of squared, area-world statistics called eigenvalues (λ). As is the case elsewhere in statistics, important concepts are given multiple names, to confuse everybody; naturally, the effort to be confusing is best focused primarily on the most important concepts, by giving them the most synonymous names. Another synonymous term for eigenvalues is characteristic roots.

The eigenvalues for this simple heuristic example are 3.0 and 3.0, as reported in Table 2.4. (Normally eigenvalues descend in value, but for this heuristic example the first two eigenvalues were exactly equal.) Eigenvalues are always indices of the amount of information represented in some multivariate result. In some multivariate analyses eigenvalues are multivariate squared correlation coefficients. Clearly, given that the first two eigenvalues associated with the factor analysis were both greater than 1.0, in EFA eigenvalues are not squared correlation coefficients.

The following four statements apply to eigenvalues in an EFA:

1. The number of eigenvalues equals the number of measured variables being analyzed.
2. The sum of the eigenvalues equals the number of measured variables.
3. An eigenvalue divided by the number of measured variables indicates the proportion of information in the matrix of associations being analyzed that a given factor reproduces.
4. The sum of the eigenvalues for the extracted factors divided by the number of measured variables indicates the proportion of the information in the matrix being analyzed that the factors as a set reproduce.

For the current example the number of measured variables is six. Therefore, there are six eigenvalues associated with the Table 2.2 correlation matrix. Because the sum of the eigenvalues for the current example is 6.0, given that the first two eigenvalues are 3.0 and 3.0, the remaining eigenvalues must be 0.0, 0.0, 0.0, and 0.0.

Because 3.0 / 6 equals 0.5, the first eigenvalue indicates that Factor I reproduces 0.5 (or 50%) of the information contained in the Table 2.2 correlation matrix.


The residual correlation matrix for the six measured variables, after the two factors are extracted, contains only zeroes:

              HANDSOME  BEAUTIFUL  UGLY  BRILLIANT  SMART  DUMB
HANDSOME          0
BEAUTIFUL         0          0
UGLY              0          0       0
BRILLIANT         0          0       0       0
SMART             0          0       0       0       0
DUMB              0          0       0       0       0      0

This result reflects the fact that for these heuristic data the two factors alone reproduce all the information contained within the original Table 2.2 correlation matrix.
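These properties are easy to verify numerically. The following numpy sketch (illustrative only, using an invented 3 x 3 correlation matrix rather than the book's data) checks the four statements listed above.

```python
import numpy as np

# Any correlation matrix will do; this small 3 x 3 matrix is invented for illustration.
R = np.array([[1.0, 0.6, 0.3],
              [0.6, 1.0, 0.4],
              [0.3, 0.4, 1.0]])

eigenvalues = np.sort(np.linalg.eigvalsh(R))[::-1]   # sorted largest to smallest

print(len(eigenvalues))                    # statement 1: equals the number of variables (3)
print(eigenvalues.sum())                   # statement 2: equals the number of variables (3.0)
print(eigenvalues[0] / len(eigenvalues))   # statement 3: proportion reproduced by Factor I
print(eigenvalues[:2].sum() / len(eigenvalues))   # statement 4: proportion for the first two factors
```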

SAMPLE SIZE CONSIDERATIONS IN EXPLORATORY FACTOR ANALYSIS

Sample size affects the precision of all statistical estimates, including those made in EFA. Various researchers have proposed rules of thumb for sample size minimums that are a function of the ratio of the number of people to the number of measured variables. The recommended ratios often are within the range of 10 to 20 people per measured variable. Gorsuch (1983) suggested that "an absolute minimum ratio is five individuals to every variable, but not less than 100 individuals for any analysis" (p. 332).

However, some Monte Carlo simulation research (Guadagnoli & Velicer, 1988) suggests that the most critical issue is how saturated the factors are by the measured variables. They suggested that replicable factors tend to be estimated if:

1. factors are each defined by four or more measured variables with structure coefficients each greater than |.6|, regardless of sample size; or
2. factors are each defined with 10 or more structure coefficients each around |.4|, if sample size is greater than 150; or
3. sample size is at least 300.

MacCallum, Widaman, Zhang, and Hong (1999) found that sample sizes as low as 60 accurately reproduced population pattern coefficients if communalities were all .60 or greater. If h² values were around .50, sample sizes of 100 to 200 were required.

Of course, when it comes to mathematically complex analyses such as EFA, more is always better. This principle is particularly relevant, because one may not know prior to conducting the analysis what the saturation pattern of the factors will be, and so rules revolving around saturation


3

EXPLORATORY FACTOR ANALYSIS DECISION SEQUENCE

Exploratory factor analysis (EFA) in actuality involves a linear sequence of decisions, each involving a menu of several available choices. Given that there are five major decisions, with many choices at each decision point, the number of different analysis combinations that are available to the researcher is quite large. These five decisions, discussed in more detail in the remainder of the present chapter, address the following questions:

1. Which matrix of association coefficients should be analyzed?
2. How many factors should be extracted?
3. Which method should be used to extract the factors?
4. How should the factors be rotated?
5. How should factor scores be computed if factor scores are of interest?

In EFA, using more than one set of analytic choices is usually best practice. As Gorsuch (1983) wisely suggested, "Factor the data by several different analytic procedures and hold sacred only those factors that appear across all the procedures used" (p. 330).

This chapter presents the EFA decision sequence. Not every choice at each decision point is described. But the most common selections are presented. And in addition to covering the most commonly encountered choices, the presentation is sufficiently broad to give the reader a flavor of


TABLE 3.1
SPSS Variable Names and Variable Labels for the Appendix A Data

Variable   SPSS variable label
PER1       '1 Willingness to help users'
PER2       '1 Giving users individual attention'
PER3       '1 Employees who deal with users in a caring fashion'
PER4       '1 Employees who are consistently courteous'
PER5       '2 A haven for quiet and solitude'
PER6       '2 A meditative place'
PER7       '2 A contemplative environment'
PER8       '2 Space that facilitates quiet study'
PER9       '3 Comprehensive print collections'
PER10      '3 Complete runs of journal titles'
PER11      '3 Interdisciplinary library needs being addressed'
PER12      '3 Timely document delivery/interlibrary loan'

the complexity of the choices. Exploratory factor analysis, to put it mildly, is not any one single analysis; EFA encompasses a broad family of choices.

The small, but real, data set presented in Appendix A will be used to make this discussion concrete. The data were randomly sampled from the LibQUAL+™ study of user perceptions of service quality at academic libraries in the United States and Canada (cf. Cook & Thompson, 2001; Thompson, Cook, & Heath, 2001; Thompson, Cook, & Thompson, 2002). Appendix A presents the ratings provided by 100 graduate students and 100 faculty from one LibQUAL+™ study. Table 3.1 presents the 12 measured variables sampled from the data set.

MATRIX OF ASSOCIATION COEFFICIENTS

The scores on measured variables are not the actual basis for EFA. Instead, these data are used to compute some matrix of bivariate associations among the measured variables. It is the matrix of associations that is analyzed within EFA. Indeed, given the matrix of associations (e.g., R_11x11) for a data set, all the steps of the factor analysis (except for the computation of factor scores) can be accomplished even without access to original data (e.g., X_100x11).

Because the factors are extracted from some matrix of associations, the factors themselves are sensitive only to those dynamics captured in a given statistic measuring bivariate relationships. If the chosen statistic only evaluates the orderings created by the measured variables, ignoring distances, then the factors will also be exclusively order-driven. If the statistic chosen to characterize association is influenced by both relationships and score variabilities, then the factors in turn will also be a function of all these dynamics. And for the same data set, different factors may emerge depending on the selection of the matrix of associations to be analyzed.

The Pearson product-moment bivariate correlation matrix is the matrix of associations most commonly used in EFA. In fact, in most statistical packages the default (used unless the user overrides the default choice) matrix of associations used in EFA is the Pearson correlation matrix.

But there are quite a number of other available choices. Different statistics that characterize relationships are sensitive to different aspects of the data. Different relationship statistics also presume that different levels of scale underlie the data.

For example, the Pearson r requires that data must be intervally scaled. Spearman's rho, on the other hand, presumes only that data are at least ordinally scaled. But if data are intervally scaled, the researcher could use either Pearson or Spearman coefficients as the basis of the analysis. In fact, Spearman's rho is Pearson's r between two variables once interval data have been converted into ranks with no ties.

For example, presume a data set with interval scores of three people on two variables:

Participant    X     Y
Barbara        1     1
Deborah        2     2
Jan            3    96

The Pearson correlation coefficient (r_XY) can be computed as:

r_XY = COV_XY / (SD_X * SD_Y),

where the covariance of X and Y is computed as:

COV_XY = (Σ(X_i - X̄)(Y_i - Ȳ)) / (n - 1),

or equivalently (because x_i = X_i - X̄ and y_i = Y_i - Ȳ) as:

COV_XY = (Σ x_i y_i) / (n - 1).

For these initial data, we obtain:

Participant    X     X̄      x      Y      Ȳ       y      xy
Barbara        1    2.0   -1.0     1    33.0   -32.0    32.0
Deborah        2    2.0    0.0     2    33.0   -31.0     0.0
Jan            3    2.0    1.0    96    33.0    63.0    63.0
Sum         6.00                99.00                  95.00
Mean        2.00                33.00
SD          1.00                54.56


So the covariance is 95.00 / 2 = 47.50. And the Pearson r equals:

47.50 / (1.00 * 54.56)
= 47.50 / 54.56
= .87.

If these intervally scaled data are converted into ranks, the only change is that Jan's score of 96 on Y becomes 3 when the scores are all converted into ranks. Now we can compute the Spearman's rho using the Pearson r formula:

Barbara

Deborah

J a n

Sum

M e a n

SD

X

1

2

3

6.00

2.00

1.00

X

2.0

2.0

2.0

X

-1.0

0.0

1.0

Y

1

2

3

6.00

2.00

1.00

Y

2.0

2.0

2.0

y

-1.0

0.0

1.0

xy

1.0

0.0

1.0

2.00

The covariance is 2.00 / 2 = 1.00. And the Pearson r equals:

1.00 / (1.00 * 1.00)
= 1.00 / 1.00
= 1.00.
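These hand computations can be verified with standard library routines; the following scipy sketch (not from the book) reproduces both results for the same three scores.

```python
import numpy as np
from scipy import stats

x = np.array([1, 2, 3])
y = np.array([1, 2, 96])

r, _ = stats.pearsonr(x, y)        # Pearson r on the interval scores
rho, _ = stats.spearmanr(x, y)     # Spearman rho

print(round(r, 2))     # 0.87
print(round(rho, 2))   # 1.0

# Spearman's rho is simply Pearson's r computed on the ranked scores (no ties here).
r_on_ranks, _ = stats.pearsonr(stats.rankdata(x), stats.rankdata(y))
print(round(r_on_ranks, 2))   # 1.0
```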

In effect, whether computed on ordinal or on interval data, Spearman's rho addresses the question, "Do the two variables order the people in exactly the same order?" Pearson's r evaluates this question also, but takes into account as well the distances between the ordered scores. Spearman's rho presumes that no such information is present in data (or ignores this information), which is why rho can be used when both variables are ordinally scaled.

These two matrices of association are two of the choices that researchers can use in extracting factors. But a third choice is the matrix of covariances itself. In fact, the covariance matrix is the most common choice as the matrix of associations investigated in confirmatory factor analysis (CFA).

From the formula r_XY = COV_XY / (SD_X * SD_Y), we can see that r_XY will equal COV_XY if either (a) r (or the covariance) is zero, or (b) the two standard deviations are reciprocals of each other (e.g., 1 and 1, .5 and 2). In all other cases, r_XY ≠ COV_XY.

In many contexts the covariance is used primarily as an intermediate calculation in obtaining the correlation coefficient, and not to describe relationship or association. The covariance is used infrequently because, unlike r, the covariance does not have a definitive range of possible values. For example, consider the following combinations of r and SD values:


r_XY  *  SD_X  *  SD_Y  =  COV_XY
1.00  *   100  *   100  =  10,000
0.50  *   100  *   200  =  10,000
0.01  *  1000  *  1000  =  10,000.

Clearly the covariance can equal a given value under a wide array of circumstances.

The covariance is jointly influenced by three aspects of the two variables: the correlation between the two variables, the variability of the first variable, and the variability of the second variable. Therefore, when exploratory factors are extracted from a covariance matrix, some factors may be a function of correlations, whereas others may be more a function of score spreadoutness. Sometimes we want our factors to be sensitive to an array of aspects of the scores. But at other times we may prefer all the factors to be sensitive only to a single aspect of our data.
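The following numpy sketch (with invented data, not the LibQUAL+ scores) illustrates the point: the same three hypothetical variables are factored once from their correlation matrix and once from their covariance matrix, and the leading dimension of the covariance matrix is dominated by the variable with the largest spread.

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal((200, 2))
x1 = z[:, 0]
x2 = 0.8 * z[:, 0] + 0.6 * z[:, 1]          # correlated with x1
x3 = 50.0 * rng.standard_normal(200)         # nearly uncorrelated, but huge spread
X = np.column_stack([x1, x2, x3])

R = np.corrcoef(X, rowvar=False)             # correlation matrix
C = np.cov(X, rowvar=False)                  # covariance matrix

# Leading (largest-eigenvalue) axis from each matrix:
first_from_R = np.linalg.eigh(R)[1][:, -1]
first_from_C = np.linalg.eigh(C)[1][:, -1]

print(np.round(first_from_R, 2))   # weights x1 and x2 heavily (driven by correlation)
print(np.round(first_from_C, 2))   # essentially just x3, the high-variance variable
```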

NUMBER OF FACTORS

A critical decision in any EFA is determining how many factors to retain. There are numerous strategies for making this decision. In general, several strategies should be used with the hope that different approaches to making this decision will corroborate each other. Although empirical evidence can inform this judgment, these decisions are in the final analysis matters of exactly that: judgment. Zwick and Velicer (1986) presented what is the most authoritative empirical evaluation of how well these various strategies work.

Statistical Significance Tests

Statistical significance tests due to Bartlett (1950) can be used in either of two ways. First, the correlation matrix can be tested to evaluate whether the matrix is an identity matrix. If this null hypothesis that the correlation matrix is an identity matrix cannot be rejected, then factors cannot sensibly be extracted from the matrix. Second, after each factor is extracted, with factors being extracted successively so as to contain progressively less and less information or variance, the residual correlation matrix can be analyzed to evaluate whether any information remains (i.e., whether the residual matrix is essentially an identity matrix).

The problem with these applications is the same general problem that applies in all statistical significance tests (see Thompson, 1996). Statistical significance is driven in large part by sample size. Because researchers generally use EFA only with reasonably large samples, even trivial correlations


Figure 3.1. Scree plot for first 100 cases and 11 variables presented in Appendix A.

recognize and retain. Trivial factors, however, are analogous to scree, and should be left behind in the extraction process.

A scree plot graphs eigenvalue magnitudes on the vertical axis, with eigenvalue numbers constituting the horizontal axis. The eigenvalues are plotted as asterisks within the graph, and successive values are connected by a line. Factor extraction should be stopped at the point where there is an "elbow," or leveling of the plot. This visual approach, not invoking statistical significance, is sometimes called a "pencil test," because a pencil can be laid on the rightmost portion of the relevant graphic to determine where the elbow or flattening seems to occur.

Statistical packages provide the scree plot on request. Of course, because this strategy requires judgment, different researchers given the same plot may disagree regarding what number of factors should be retained.

Figure 3.1 presents the scree plot associated with the analysis of the first 100 cases of data for the first 11 variables listed in Appendix A. These data are perceptions of 100 graduate students regarding library service quality on the first 11 variables listed in Table 3.1. This plot suggests that three factors should be extracted. Empirical variations on this "visual scree" test use regression analyses to evaluate objectively where the elbow in the plot occurs (Nasser, Benson, & Wisenbaker, 2002).
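For readers working outside SPSS, a scree plot is easy to produce directly from the eigenvalues of a correlation matrix; the following matplotlib sketch (with an invented stand-in matrix rather than the LibQUAL+ data) shows the idea.

```python
import numpy as np
import matplotlib.pyplot as plt

# R would ordinarily be the observed correlation matrix of the measured variables;
# this small 4 x 4 matrix is invented purely for illustration.
R = np.array([[1.0, 0.7, 0.6, 0.2],
              [0.7, 1.0, 0.5, 0.3],
              [0.6, 0.5, 1.0, 0.2],
              [0.2, 0.3, 0.2, 1.0]])

eigenvalues = np.sort(np.linalg.eigvalsh(R))[::-1]

plt.plot(range(1, len(eigenvalues) + 1), eigenvalues, marker='*')
plt.xlabel('Factor')
plt.ylabel('Eigenvalue')
plt.title('Scree plot')
plt.show()   # look for the "elbow" where the plotted line levels off
```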

Inspection of the Residual Correlation Matrix

As noted previously, as more factors are extracted the entries in the residual correlation matrix (R_VxV-) approach zeroes. If all possible factors are extracted, then the residual matrix will always consist only of zeroes.


Logically, another approach to determining the number of noteworthy factors involves examination of the residual matrix as successive factors are extracted. Computer packages provide the residual matrix on request. And some packages count the number of entries in one triangle of the residual correlation matrix that are greater than |.05| to help inform this evaluation. Note that this criterion has nothing to do with p < .05 and that the cutoff value is somewhat arbitrary. Researchers may invoke other criteria, and might elect to use larger cutoffs when sample sizes are small and the standard errors of the correlation coefficients are larger.
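A small numpy sketch of this residual count follows; it is illustrative only, the function name is invented, and it assumes uncorrelated factors so that the reproduced matrix is simply P P'.

```python
import numpy as np

def count_large_residuals(R_actual, P, cutoff=0.05):
    # Reproduce R from the pattern matrix P (uncorrelated factors assumed),
    # then count off-diagonal residuals whose absolute value exceeds the cutoff.
    residual = R_actual - P @ P.T
    off_diagonal = residual[np.triu_indices_from(residual, k=1)]
    return int((np.abs(off_diagonal) > cutoff).sum())

# Example with the heuristic Table 2.4 pattern matrix, where the two factors
# reproduce the correlations perfectly (so no large residuals remain):
P = np.array([[1., 0.], [1., 0.], [-1., 0.], [0., 1.], [0., 1.], [0., -1.]])
print(count_large_residuals(P @ P.T, P))   # 0
```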

Parallel Analysis

Horn (1965) proposed a strategy called parallel analysis that appears to be among the best methods for deciding how many factors to extract or retain (Zwick & Velicer, 1986). The strategy involves taking the actual scores on the measured variables and creating a random score matrix of exactly the same rank as the actual data with scores of the same type represented in the data set. There are various approaches to this strategy.

The focus of the method is to create eigenvalues that take into account the sampling error that influences a given set of measured variables. If scores are randomly ordered, the related correlation matrix will approximate an identity matrix, and the eigenvalues of that matrix will fluctuate around 1.0 as a function of sampling error. The range of fluctuations will tend to be narrower as the sample size is larger and the number of measured variables is smaller.

If one presumes that actual data are approximately normally distributed, then the short SPSS syntax presented by Thompson and Daniel (1996) may be used to conduct the analysis. This program creates randomly ordered scores consisting of the same integer values (e.g., 1, 2, 3, 4, or 5) represented in some data sets such as Likert scale data.

Other approaches use regression equations published in prior simulation studies to predict the eigenvalues that would be produced from the correlation matrices for randomly ordered data having a certain rank. Montanelli and Humphreys (1976) produced one such equation that uses as input only the actual sample size and the number of measured variables. Lautenschlager, Lance, and Flaherty (1989) published an alternative formula that uses these two predictors but also uses the ratio of variables-to-participants as another predictor variable.

A more sophisticated approach exactly honors not only the types of possible values contained in the data set but also the exact values (and their descriptive statistics including shape statistics such as skewness) in the real data set. Table 3.2 presents the first 25 scores from Appendix A on the variables PER1, PER2, PER5, and PER6. The data on these variables


TABLE 3.2
Variables Randomly Sorted for Parallel Analysis

              Original data              Randomly sorted data
ID       PER1  PER2  PER5  PER6        RAN1  RAN2  RAN5  RAN6
1          8     7     3     2           9     5     3     6
2          5     7     4     5           8     9     7     4
3          6     5     5     3           7     5     5     5
4          5     5     4     4           5     7     4     4
5          5     5     5     4           5     5     4     7
6          7     7     7     8           5     5     5     3
7          8     8     6     7           9     9     5     3
8          7     7     5     3           9     7     6     5
9          1     3     1     1           8     9     2     8
10         9     9     5     7           8     3     5     7
11         9     9     7     9           8     8     1     9
12         4     3     6     5           7     3     7     6
13         6     6     7     7           1     7     7     1
14         7     7     2     3           8     9     5     3
15         9     9     7     5           7     9     7     5
16         9     9     5     5           9     7     5     7
17         8     8     7     6           9     7     9     3
18         8     5     4     8           7     7     6     6
19         9     9     6     7           6     6     4     6
20         8     7     5     6           4     8     7     2
21         9     9     9     9           8     8     6     6
22         7     7     7     6           7     7     7     5
23         7     8     5     6           6     9     5     9
24         8     9     5     3           8     8     6     8
25         8     8     6     6           9     9     5     7
Mean    7.08  7.04  5.32  5.40        7.08  7.04  5.32  5.40
SD      1.93  1.84  1.75  2.14        1.93  1.84  1.75  2.14

were independently randomly ordered to create the scores on variables RAN1, RAN2, RAN5, and RAN6.

Note that the means and standard deviations on related actual and randomly ordered data match exactly (e.g., Mean_PER1 = Mean_RAN1 = 7.08; SD_PER1 = SD_RAN1 = 1.93). Other descriptive statistics, such as coefficients of skewness and kurtosis, would also match for each actual and randomly ordered variable pair.

However, randomly ordered scores should have bivariate correlations approaching zero, with eigenvalues all fluctuating around one. Table 3.3 presents the correlation matrix for the actual data presented in Table 3.2, and correlation coefficients for the same data after independent random ordering of the scores on each variable. Note that the correlation coefficients for the randomly ordered variables are closer to zero than the correlations among the actual scores.


TABLE 3.3
Bivariate Correlations for Actual and Randomly Ordered Data

                PER1/     PER2/     PER5/     PER6/
Variable        RAN1      RAN2      RAN5      RAN6
PER1/RAN1                  .15      -.13       .41
PER2/RAN2        .87                 .01      -.02
PER5/RAN5        .51       .46                -.55
PER6/RAN6        .54       .43       .73

Note. The coefficients above the diagonal are correlations among the randomly ordered data. The coefficients below the diagonal are correlations among the actual data in its original order as presented in Table 3.2.

TABLE 3.4
Eigenvalues for Actual and Randomly Ordered Data

Number    Actual data eigenvalue    Random order eigenvalue
1                 2.776                     1.756
2                 0.828                     1.082
3                 0.276                     0.797
4                 0.119                     0.364
Sum               4.000                     4.000

In parallel analysis the eigenvalues in successive pairs from the actual and the randomly ordered data are compared. Factors are retained whenever the eigenvalue for the actual data for a given factor exceeds the eigenvalue for the related factor for the randomly ordered scores. These eigenvalues for the Table 3.2 data and the Table 3.3 correlation matrices are presented in Table 3.4.

For this example, because 2.776 is greater than 1.756, the first factor from the actual data would be extracted. Because 0.828 is less than 1.082, the second factor would not be extracted.

Brian O'Connor (2000) has written several SPSS and SAS syntax files to execute variations on these analyses. The programs can also be downloaded from http://flash.lakeheadu.ca/~boconno2/nfactors.html
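For readers who prefer to see the logic spelled out, the following numpy sketch (illustrative only, and not O'Connor's syntax) shuffles each column of the data independently, averages the random eigenvalues over several reorderings, and retains factors as long as the actual eigenvalue exceeds its random counterpart; the worked example above used a single reordering.

```python
import numpy as np

def parallel_analysis(X, n_reorderings=100, seed=0):
    # X: cases x measured variables.
    rng = np.random.default_rng(seed)
    actual = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]

    random_eigs = np.zeros((n_reorderings, X.shape[1]))
    for i in range(n_reorderings):
        # Independently shuffle each column, destroying the intervariable relationships.
        shuffled = np.column_stack([rng.permutation(col) for col in X.T])
        random_eigs[i] = np.sort(
            np.linalg.eigvalsh(np.corrcoef(shuffled, rowvar=False)))[::-1]

    mean_random = random_eigs.mean(axis=0)
    keep = actual > mean_random
    n_retain = int(np.argmin(keep)) if not keep.all() else len(keep)
    return actual, mean_random, n_retain
```

Applied to the Table 3.2 scores, a procedure of this kind would retain only the first factor, because the first actual eigenvalue (2.776) exceeds its random counterpart and the second (0.828) does not, matching the decision reached above.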

FACTOR EXTRACTION METHOD

There are numerous statistical theories that can be used to compute factor pattern coefficients. Probably the most frequently used EFA extraction method, perhaps because it is the default analysis in most statistical packages, is called principal components analysis (see Russell, 2002).

Principal components analysis assumes that the scores on measured variables have perfect reliability. Of course, scores are never perfectly reliable (see Thompson, 2003). Furthermore, the analysis attempts to reproduce the variance or information in the sample data, rather than the population. Of course, if the sample is reasonably representative of the population data, the sample factors should tend to match the population factors.

Principal components analysis, which actually was the analysis described in chapter 2, uses ones on the diagonal of the correlation matrix. This seems reasonable to the extent that scores on a given measured variable should correlate perfectly with the same scores on the same variable.

However, if scores on a variable are not perfectly reliable, then correlations of scores on a measured variable with itself will not be perfect (i.e., +1.0) if measurement error is taken into account. Logically, it might be reasonable to alter the diagonal entries of the correlation matrix to take measurement error into account.

It was noted previously that communality coefficients are lower-bound (or conservative) estimates of score reliability. Another common factor extraction method, principal axes factor analysis, uses the communality coefficients to replace the ones on the diagonal of the correlation matrix.

Principal axes factor analysis often starts with a principal components analysis. Then the communality coefficients from this analysis are substituted respectively for the ones on the diagonal of the original correlation matrix. It is important to emphasize that no off-diagonal data are altered in this process. Then a new set of factors and their corresponding communality coefficients are estimated.

This process is successively repeated until the communality estimates stabilize. This process is called iteration. Iteration is a statistical process of estimating a set of results, and then successively tweaking results until two consecutive sets of estimates are sufficiently close that the iteration is deemed to have converged. Iteration is used in various ways throughout multivariate statistics, including in other aspects of EFA, such as factor rotation, which is discussed in the next section.

  have converged. Iteration

  is

 used

  in

 var ious ways  throughout  mult ivar ia te

statistics,

  inc luding

 in

  other aspects

  of

  EFA, such

  as

  factor rotation, which

is  discussed in the next section.

The principal axes communality iteration process does not  always

converge. Estimates may sequentially bounce u p, bounce dow n, bounce up,

and so  forth.  If  this happens, often  the  sample  size  is too  small  for the

num ber of m easured variables and the m odel being estimated.

Always carefully check whether iteration has converged. Some computer programs print only innocuous-sounding warnings about failure to converge, and then still print full results. So the unwary analyst may not notice a convergence problem. Programs also have built-in maximum numbers of iterations (e.g., 25). If a solution does not converge, override the default number of iterations. If a solution does not converge in 100 or 125 iterations, usually the solution will not converge at any number of iterations. Either a different factor extraction method must be selected or sample size must be increased.


Principal components analysis does not have to be the first step of the principal axes factor analysis. Any reliability estimates that the researcher deems reasonable can be used as initial estimates. For example, if the measured variables are test scores, the reliability coefficients reported either in test manuals or in a prior "reliability generalization" (RG) analysis (Vacha-Haase, 1998) can be used as initial estimates placed on the diagonal of the correlation matrix.

Another strategy for creating initial estimates of communality coefficients is available in most statistical packages. The multiple R² of a given measured variable with all the other measured variables is used as the initial estimate for that variable's h² (e.g., h²_1 = R²_PER1 x PER2,PER5,PER6; h²_2 = R²_PER2 x PER1,PER5,PER6).

In theory, what one uses as the initial communality coefficients in this iteration process will affect how many iterations must be performed (with principal components invoking ones on the diagonal of the correlation matrix obviously taking the most iterations) but not what are the final estimates. If this is so, given that the only downside of numerous iterations is wasting a little electricity, it doesn't matter too much which initial estimation procedure is used.

Other factor extraction methods include alpha factor analysis, maximum likelihood factor analysis, image factor analysis, and canonical factor analysis. Alpha factor analysis focuses on creating factors with maximum reliability. Maximum likelihood analysis focuses on creating factors that reproduce the correlation or covariance matrix in the population, versus in the sample. Image factor analysis focuses on creating factors of the latent variables that exclude or minimize "unique factors" consisting of essentially only one measured variable. Canonical factor analysis seeks to identify factors that are maximally related to the measured variables.

FACTOR ROTATION

Factor rotation involves moving the factor axes measuring the locations of the measured variables in the factor space so that the nature of the underlying constructs becomes more obvious to the researcher. Factor rotation can be performed either graphically, guided only by the researcher's visual judgment, or empirically using one of the dozens of different algorithms that have been developed over the years. Analytical rotation has some advantages over graphical rotation, because the procedure is automated, and because any two researchers analyzing the same data will arrive at the same factor structure when using the same empirical rotation method.


TABLE 3.5
Correlation and Squared Correlation Coefficients for 6 Variables and 100 Participants

Variable    PER1     PER2     PER3     PER5     PER6     PER7
PER1        1.00    72.3%    60.8%    16.0%    10.2%    23.0%
PER2         .85     1.00    51.8%    20.3%    10.2%    25.0%
PER3         .78      .72     1.00    26.0%    16.0%    25.0%
PER5         .40      .45      .51     1.00    43.6%    64.0%
PER6         .32      .32      .40      .66     1.00    50.4%
PER7         .48      .50      .50      .80      .71     1.00

Note. The (unsquared) correlation coefficients are presented in the bottom triangle. The squared correlation coefficients are presented as percentages in the top triangle; values above 43% are italicized.

Rotation is not possible when only one factor is extracted. But in virtually all cases involving two or more factors, rotation is usually essential to interpretation.

The first 100 scores on six LibQUAL+™ variables (PER1, PER2, PER3, PER5, PER6, and PER7) presented in Appendix A can be used to demonstrate why rotation is so essential in EFA. Table 3.5 presents the bivariate r and r² values associated with these data.

An examination of the r² values in Table 3.5 (remember that r values must be squared before their magnitudes can be compared in an interval metric) suggests (a) the existence of two factors, consisting respectively of PER1, PER2, and PER3, and of PER5, PER6, and PER7, because all pairs of the variables in these two sets have r² values greater than 43%, and (b) the two factors probably are not highly correlated, because the r² values between measured variables across the two sets are all less than 27%. The unrotated pattern/structure coefficients and the eigenvalues presented in Table 3.6 confirm the expectation of two factors. The two factors reproduce 84.2% (3.81 + 1.23 = 5.05; 5.05 / 6 = .842) of the variance in the correlation matrix involving the six measured variables.

However, the unrotated factor pattern/structure coefficients presented in Table 3.6 do not match our expectations regarding the composition of the factors. For example, as reported in the table, all six variables are correlated .70 or more with Factor I. Yet the Figure 3.2 plot of the factor space, based on the pattern/structure coefficients reported in Table 3.6, does match our expectations regarding factor structure. This suggests that the patterns in our unrotated results are exactly what we expect, but that the patterns cannot be readily discerned from statistical results in the form of unrotated results.

The idea of rotation is that the locations of the axes in the factor space are completely arbitrary. We can move (i.e., rotate) these axes to any


TABLE 3.6
Unrotated Principal Components Pattern/Structure Coefficients

              Unrotated factors          Squared structure
Variable         I        II              I        II        h²
PER1           0.81     -0.50           65.9%    25.3%     91.2%
PER2           0.81     -0.46           65.6%    20.9%     86.5%
PER3           0.82     -0.35           67.9%    12.1%     80.0%
PER5           0.80      0.43           63.6%    18.8%     82.4%
PER6           0.70      0.54           49.2%    29.7%     78.9%
PER7           0.83      0.41           69.3%    16.4%     85.7%
λ                                        3.81     1.23

Note. Pattern/structure coefficients > |.35| are italicized. For the principal components model the eigenvalues equal the sums of the squared unrotated pattern/structure coefficients for a given factor (e.g., .659 + .656 + .679 + .636 + .492 + .693 = 3.81).

Figure 3.2. Table 3.6 variables presented in factor space before rotation.

location, and in any manner we wish. However, we may not move the location of any variable in the factor space.

Persons first learning of rotation are often squeamish about the ethics of this procedure. Certainly ethics have an important bearing on the exercise of statistical choice. If we have several mice in a drug experiment, and Elwood in the experimental group is not responding well to the new intervention, it is unethical to take Elwood out for a little walk at 3:00 a.m. that results in Elwood's "accidental" demise. That is unethical scientific conduct (and cruelty to Elwood). But factor rotation is not in any respect unethical (as long as we only move factor axes and not measured variables).

Table 3.7 presents an example of an empirical rotation of the Table 3.6 pattern/structure coefficient matrix. Note that (a) the communality coefficients for each of the variables remain unaltered by rotation and (b) the total variance reproduced in the factors as a set (84.2% = [2.59 + 2.46 = 5.05; 5.05 / 6 = .842]) remains unaltered.


TABLE 3.7
Rotated Principal Components Pattern/Structure Coefficients

              Rotated factors            Squared structure
Variable         I        II              I        II        h²
PER1           0.94      0.19           87.5%     3.7%     91.2%
PER2           0.90      0.23           81.4%     5.1%     86.5%
PER3           0.84      0.31           70.1%     9.9%     80.0%
PER5           0.28      0.86            7.9%    74.5%     82.4%
PER6           0.13      0.88            1.8%    77.1%     78.9%
PER7           0.32      0.87           10.6%    75.2%     85.7%
Trace                                    2.59     2.46

Note. Pattern/structure coefficients > |.35| are italicized. For the principal components model the trace variance reproduced by the rotated factors equals the sums of the squared rotated pattern/structure coefficients for a given factor (e.g., .875 + .814 + .701 + .079 + .018 + .106 = 2.59).

However, it is true that rotation tends to redistribute the location of variance within the solution. In the unrotated solution the first factor reproduces the bulk of the observed variance (3.81 / 6 = 63.5%). In the rotated solution the two rotated factors reproduce 43.2% (2.59 / 6) and 41.0% (2.46 / 6) of the observed variance. The rotation has almost equalized the location of the information within the solution. This tends to happen to the extent that rotations are performed in pursuit of "simple structure."

Thurstone (1935, p. 156, 1947, p. 335) proposed the basic concept of simple structure and the associated criteria for evaluating simple structure. Think about what set of factor pattern coefficients would maximize our ability to interpret the nature of the latent constructs underlying scores on measured variables.

T he  un rotated results presented  in Table 3.6 w ere virtu ally impossible

to

  interpret because

  all of the

  m easured v ariab les were very highly correlated

with

  both

  factors. A more interp retation -frien dly solution w ould strike a

balance between hav ing enoug h m easured variables correlated w ith

  a

 factor

that  some measured variables reveal the underlying nature of a factor, but

not so  many  that  th e  factors cannot  be  d is t inguished  from  each  other  or

are so complex

  that

  their nature

 cannot

  easily he apprehended.

A   s imple s t ructure would involve

  from

  a  column perspective roughly

an equal number of measured variables being appreciably correlated with

each  factor (e.g., 6/2 = 3), bu t  from  a row perspective m ost  or all measured

variables being appreciably correlated with only   one  factor.

 These

  criteria

are met

  very well

  for the

  rotated factor solution presented

  in Table

  3.7.

Empirical (i.e., nongraphical) rotation methods perform the rotation by applying an algorithm to derive a transformation matrix (T_FxF) by which the unrotated pattern coefficient matrix (P_VxF) is postmultiplied to yield the rotated factors. There are dozens of formulas for deriving T_FxF matrices,


several of which are available in various statistical packages. These computer programs will print the transformation matrices on request. The more a transformation matrix deviates from being an identity matrix, the more the unrotated versus the rotated factor pattern coefficients will be different.
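As a hedged illustration of the postmultiplication just described (the loadings and the 40-degree angle below are assumed values for demonstration, not results from the book's data), an orthogonal transformation matrix T rotates the pattern matrix like this:

import numpy as np

P_unrotated = np.array([[0.80,  0.45],      # illustrative unrotated pattern coefficients (V x F)
                        [0.78,  0.40],
                        [0.70, -0.50],
                        [0.68, -0.55]])

theta = np.deg2rad(40)                      # a simple planar rotation stands in for an empirical T
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

P_rotated = P_unrotated @ T                 # rotated pattern = unrotated pattern postmultiplied by T
print(np.round(P_rotated, 2))
# The farther T departs from an identity matrix, the more these rotated
# coefficients differ from the unrotated ones.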

Orthogonal Rotation

Figure 3.2 and Table 3.7 involve a case in which the original 90-degree angle between the two factors was maintained in the rotation process. This means that the rotated factors will remain orthogonal or uncorrelated. Factors are always orthogonal upon extraction, and remain uncorrelated if orthogonal rotation is used.

The correlation coefficient actually is a function of the cosine of the angle between either measured or composite variables. When two variables have been standardized to have a length or variance of 1.0, as factors are, the cosine of the angle between two factors is the correlation of the factors (see Gorsuch, 1983, chap. 4).

The most common orthogonal rotation method, and indeed the most common rotation of any kind, is the varimax rotation method developed by Kaiser (1958). This is the default rotation in most statistical packages. Varimax tends to focus on maximizing the differences between the squared pattern/structure coefficients on a factor (i.e., focuses on a column perspective). In my experience, in about 85% of exploratory factor analyses varimax will yield simple structure.
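The quantity varimax works on can be sketched in a few lines. The function below computes only the raw (unnormalized) varimax criterion, the sum across factors of the variances of the squared loadings within each column; Kaiser's full algorithm also normalizes rows, so this is a simplified illustration, and the "unrotated-like" values are invented for contrast rather than taken from Table 3.6.

import numpy as np

def raw_varimax_criterion(L):
    # Sum, over factors, of the variance of the squared loadings within each column.
    return float(np.sum(np.var(np.asarray(L) ** 2, axis=0)))

rotated = [[0.94, 0.19], [0.90, 0.23], [0.84, 0.31],            # Table 3.7 loadings
           [0.28, 0.86], [0.13, 0.88], [0.32, 0.87]]
unrotated_like = [[0.66, -0.47], [0.65, -0.44], [0.67, -0.36],  # illustrative only
                  [0.64,  0.52], [0.57,  0.60], [0.68,  0.52]]

print(raw_varimax_criterion(rotated))          # larger: each column has a few big and many small loadings
print(raw_varimax_criterion(unrotated_like))   # smaller: every variable loads moderately on both factors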

Another orthogonal rotation method is quartimax, which tends to focus on simple structure primarily from the perspective of rows (i.e., for variables across the factors). Quartimax is most appropriate when the first factor is expected to be a large "G" or general factor that is saturated by disproportionately many of the variables. Equamax is a compromise between the varimax and quartimax methods.

Oblique Rotation

In some cases simple structure cannot be obtained using orthogonal rotation. Typically this is indicated by variables having pattern/structure coefficients that are large in absolute value on two or more factors (which is sometimes called multivocal vs. univocal). Such factors are difficult to interpret. The researcher must then turn to oblique rotation in the pursuit of simple structure.

However, as noted in chapter 2, when factors are rotated orthogonally, the pattern coefficient and the structure coefficient matrices, which always have the same rank, also will contain exactly the same numbers. This occurs because when factors are uncorrelated the interfactor correlation matrix


(R_FxF) is an identity matrix, and given that P_VxF R_FxF = S_VxF, if R_FxF = I_FxF, then necessarily P_VxF = S_VxF.

But once factors are rotated obliquely, the factors are correlated, and P_VxF does not equal S_VxF. Correlations among the factors that deviate further from zero lead to greater differences between the pattern and the structure coefficients. And once factors are rotated obliquely, the pattern coefficients are no longer correlation coefficients. On obliquely rotated factors, any or even all of the pattern coefficients can be outside the range -1 to +1.

When orthogonal rotation is performed in computer packages, R_FxF will not be printed, because the interfactor correlation matrix is by definition known to be the identity matrix. And because P_VxF equals S_VxF, only one factor matrix will be printed, which may be somewhat ubiquitously labeled something like "varimax-rotated factor matrix." When an oblique rotation is performed, all three matrices (P_VxF, R_FxF, and S_VxF) will always be printed, so that the results can be correctly interpreted by consulting both P_VxF and S_VxF.
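The identity that drives this behavior, S_VxF = P_VxF R_FxF, is easy to check numerically. In the sketch below the pattern coefficients and the .4 interfactor correlation are assumed values chosen only for illustration.

import numpy as np

P = np.array([[0.90, 0.05],         # illustrative oblique pattern coefficients (V x F)
              [0.85, 0.10],
              [0.05, 0.88],
              [0.10, 0.80]])

R_orthogonal = np.eye(2)                       # interfactor correlations for orthogonal factors
R_oblique    = np.array([[1.0, 0.4],
                         [0.4, 1.0]])          # factors correlated .4

print(np.allclose(P @ R_orthogonal, P))        # True: structure equals pattern when R is an identity matrix
print(np.round(P @ R_oblique, 2))              # structure coefficients now differ from the pattern coefficients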

When oblique rotation is necessary, promax rotation (Hendrickson & White, 1964) is almost always a good choice. Promax is actually conducted as a series of rotations. The first rotation is an orthogonal rotation, such as varimax. These pattern/structure coefficients are then raised to some exponential power, k, which is sometimes called a pivot power. If an even-numbered pivot power (e.g., k = 4) is used, the signs of the resulting coefficients are restored after the exponential operation is performed. Raising varimax pattern/structure coefficients to an exponential power makes all the resulting coefficients closer to zero. But the effect is differential across original values that are larger in magnitude versus smaller in magnitude. For example, .9 to the 4th power is .656, whereas .2 to the 4th power is .002. The resulting coefficients tend to have simple structure.
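A minimal sketch of this pivot-power step (only the target-building part of promax, not the subsequent Procrustean fit described next) is:

import numpy as np

def promax_target(varimax_loadings, k=4):
    # Raise each coefficient to the kth power, then restore its original sign.
    L = np.asarray(varimax_loadings, dtype=float)
    return np.sign(L) * np.abs(L) ** k

print(promax_target([0.9, 0.2, -0.9]))
# [ 0.6561  0.0016 -0.6561]: large coefficients shrink far less than small ones,
# so the resulting target matrix has near-simple structure.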

Next, a "Procrustean" rotation is invoked. Procrustean rotation can also be useful as a rotation method on its own merits and not as part of promax rotation, as when a researcher wants to rotate factors to best fit a theoretical structure (Thompson & Pitts, 1982), or when conducting an EFA meta-analysis (Thompson, 1989). Indeed, some test distributions have been developed for statistically evaluating the fit of actual and theoretically expected structures, or of actual structures from different studies (Thompson, 1992b).

Procrustean rotation derives its name   from  the Greek myth about an

innkee per w ith only one bed . If gu ests we re too short for the bed, he stretched

the  guests  on a  rack.  If guests were  too  tall  for the  bed ,  he cut off parts of

their legs.

Procrustean rotation finds the best fit of an actual matrix to some target matrix (e.g., a hypothetically expected pattern matrix, the factor solution for the same variables in a prior study). In the promax use of


Procrustean rotation, the varimax pattern/structure coefficients each raised to the pivot power become the target. The resulting Procrustean rotation is the promax solution.

The researcher can control the degree of correlation among the factors (R_FxF) by changing the pivot power. The higher the exponent, the greater will be the degree of correlation among the factors.

Another oblique rotation method is oblimin. Oblimin invokes a value called delta to control the degree of correlation among the rotated factors. Delta values of zero yield factors that are more highly correlated, whereas large negative values produce factors that are more uncorrelated. However, in cases where oblique solutions are required, promax is usually a very good choice.

FACTOR   SCORES

As noted in chapter 1, sometimes factor analysis is used not to identify and interpret factors, but to obtain a more parsimonious set of composite scores (i.e., factor scores) that are then used in subsequent analyses (e.g., regression) instead of the measured variable scores. There are four methods of obtaining factor scores (i.e., composite variable scores for each person on each factor) that are briefly summarized here. Chapter 4 includes more detailed comparisons of these methods.

The most commonly used method of obtaining factor scores (F_NxF) is called the regression method. As a first step, the measured variables are each converted into z scores with means of zero and SDs of 1.0. Then the following algorithm is applied:

F_NxF = Z_NxV R_VxV^-1 P_VxF.

The rightmost portion of the formula can be reexpressed as:

W_VxF = R_VxV^-1 P_VxF.

This is the "factor score matrix" output by SPSS.
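A minimal numpy sketch of this estimator follows; X stands for any n x v raw-data matrix and P for the rotated pattern/structure matrix, names that are assumptions for illustration rather than the book's notation for a specific data set.

import numpy as np

def regression_factor_scores(X, P):
    # X: n x v measured variables; P: v x f rotated pattern/structure coefficients.
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # convert the measured variables to z scores
    R = np.corrcoef(Z, rowvar=False)                   # R_VxV, the correlation matrix
    W = np.linalg.solve(R, P)                          # W_VxF = R^-1 P, the factor score coefficient matrix
    return Z @ W                                       # F_NxF = Z W, one row of factor scores per person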

Second, Bartlett (1937) proposed methods for estimating factor scores intended to minimize the influence of the unique factors consisting of single measured variables not usually extracted in the analysis. Third, Anderson and Rubin (1956) proposed methods similar to Bartlett's methods, but with an added constraint that the resulting factor scores must be orthogonal.

All of these factor score estimation procedures yield factor scores that are themselves z scores, if the extraction method is principal components, or the estimation method is Anderson-Rubin. This does not create difficulties if


TABLE 3.9
z Scores and Standardized Noncentered Scores for the Four Measured Variables

               z scores                        Standardized noncentered
ID       ZPER1   ZPER2   ZPER5   ZPER6      CPER1   CPER2   CPER5   CPER6
1          .48    -.02   -1.33   -1.59       7.56    7.02    3.99    3.81
2        -1.08    -.02    -.75    -.19       6.00    7.02    4.57    5.21
3         -.56   -1.11    -.18   -1.12       6.52    5.93    5.14    4.28
4        -1.08   -1.11    -.75    -.65       6.00    5.93    4.57    4.75
5        -1.08   -1.11    -.18    -.65       6.00    5.93    5.14    4.75
6         -.04    -.02     .96    1.21       7.04    7.02    6.28    6.61
7          .48     .52     .39     .75       7.56    7.56    5.71    6.15
8         -.04    -.02    -.18   -1.12       7.04    7.02    5.14    4.28
9        -3.14   -2.20   -2.47   -2.06       3.94    4.84    2.85    3.34
10         .99    1.07    -.18     .75       8.07    8.11    5.14    6.15
11         .99    1.07     .96    1.68       8.07    8.11    6.28    7.08
12       -1.59   -2.20     .39    -.19       5.49    4.84    5.71    5.21
13        -.56    -.57     .96     .75       6.52    6.47    6.28    6.15
14        -.04    -.02   -1.90   -1.12       7.04    7.02    3.42    4.28
15         .99    1.07     .96    -.19       8.07    8.11    6.28    5.21
16         .99    1.07    -.18    -.19       8.07    8.11    5.14    5.21
17         .48     .52     .96     .28       7.56    7.56    6.28    5.68
18         .48   -1.11    -.75    1.21       7.56    5.93    4.57    6.61
19         .99    1.07     .39     .75       8.07    8.11    5.71    6.15
20         .48    -.02    -.18     .28       7.56    7.02    5.14    5.68
21         .99    1.07    2.10    1.68       8.07    8.11    7.42    7.08
22        -.04    -.02     .96     .28       7.04    7.02    6.28    5.68
23        -.04     .52    -.18     .28       7.04    7.56    5.14    5.68
24         .48    1.07    -.18   -1.12       7.56    8.11    5.14    4.28
25         .48     .52     .39     .28       7.56    7.56    5.71    5.68
Mean      0.00    0.00    0.00    0.00       7.08    7.04    5.32    5.40
SD        1.00    1.00    1.00    1.00       1.00    1.00    1.00    1.00

Note. The means of the standardized noncentered scores are the same as the means of these variables in their original form, as reported in Table 3.2.

descriptives variables=cper1 to cper6 .
compute btscr1= (.55080 * cper1) + (.62861 * cper2)
 + (-.17846 * cper5) + (-.18121 * cper6) .
compute btscr2= (-.11583 * cper1) + (-.22316 * cper2)
 + (.60668 * cper5) + (.61055 * cper6) .
print formats fscr1 fscr2 btscr1 btscr2 (F8.3) .
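The COMPUTE statements above are simply a matrix multiplication of the standardized noncentered scores by the factor score coefficient matrix. A small numpy check (illustrative only, using the weights above and participant 1's Table 3.9 scores) reproduces the first row of Table 3.10 within rounding:

import numpy as np

W = np.array([[ 0.55080, -0.11583],       # factor score coefficients from the COMPUTE statements
              [ 0.62861, -0.22316],
              [-0.17846,  0.60668],
              [-0.18121,  0.61055]])

c1 = np.array([7.56, 7.02, 3.99, 3.81])   # CPER1, CPER2, CPER5, CPER6 for ID 1 in Table 3.9

print(np.round(c1 @ W, 2))                # about [7.17, 2.30], i.e., BTSCR1 and BTSCR2 for ID 1 in Table 3.10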

Table 3.10 presents both the regular regression factor scores and the standardized noncentered factor scores for these data. Both sets of factor scores are compared here only for heuristic purposes. In actual research the score estimation method would be selected on the basis of the research questions and the related preferences for different factor score properties.


TABLE 3.10
Conventional Regression Factor Scores (F_25x2) and Standardized Noncentered
Factor Scores for the EFA of the Four Measured Variables

              Regression             Standardized noncentered
ID         FSCR1     FSCR2            BTSCR1     BTSCR2
1           .773    -1.824             7.170      2.309
2          -.437     -.442             5.960      3.691
3          -.770     -.483             5.627      3.650
4         -1.037     -.485             5.360      3.649
5         -1.139     -.138             5.258      3.996
6          -.428     1.334             5.969      5.467
7           .386      .520             6.783      4.654
8           .199     -.786             6.596      3.348
9         -2.300    -1.898             4.097      2.235
10         1.115     -.008             7.512      4.126
11          .741     1.256             7.138      5.390
12        -2.295      .797             4.102      4.930
13         -.970     1.230             5.427      5.363
14          .505    -1.826             6.903      2.307
15         1.080      .115             7.477      4.249
16         1.284     -.578             7.681      3.555
17          .368      .582             6.765      4.715
18         -.522      .476             5.875      4.610
19         1.013      .339             7.410      4.472
20          .230      .010             6.627      4.143
21          .537     1.950             6.934      6.083
22         -.259      .763             6.138      4.897
23          .288     -.052             6.685      4.082
24         1.169    -1.089             7.566      3.045
25          .470      .235             6.867      4.369
Mean       0.000     0.000             6.397      4.133
SD         1.000     1.000             1.000      1.000

The standardized noncentered factor scores are perfectly correlated with their regular regression factor score counterparts (i.e., r of FSCR1 with BTSCR1 = 1.0 and r of FSCR2 with BTSCR2 = 1.0), and the correlations across the two factor scores also match across the two score types (i.e., r of FSCR1 with FSCR2 equals r of BTSCR1 with BTSCR2, which here is 0.0). For the Table 3.10 results, a comparison of the means on the standardized noncentered factor scores suggests that the 25 participants on the average rated their libraries higher on Factor I (6.397) than on Factor II (4.133).

MAJOR CONCEPTS

The sequence of EFA involves numerous decisions (i.e., what association matrix to analyze, how many factors to extract, factor extraction method,


factor rotation method, factor score estimation method). For each of the steps of analysis, there are myriad possible decisions or ways of informing judgment. Thus, across all the steps and all the choices for each step, the possible number of combinations of analytic choices is huge.

It is also important not to limit the analysis at any stage to a single analytic choice. For example, in deciding how many factors to extract, researchers do well to consult several procedures (e.g., scree plot, parallel analysis) and then evaluate whether various strategies corroborate each other. The most confidence can be vested in solutions for which factors are stable or invariant over various analytic choices.

The purpose of EFA is to isolate factors that have simple structure or, in other words, are interpretable. Any thoughtful analytic choices that yield clear factors are justified, including analyzing the data in lots of different ways to "let the data speak."

In practice, factor rotation is necessary. The only exception is when a single factor is extracted, in which case no rotation is possible. Nor in such a case is rotation needed, because presumably every variable will saturate the single factor, and the factor is whatever explains all the variables being highly associated with each other. Usually when multiple factors are extracted, reasonable simple structure is realized with varimax rotation. And if these factors are interpretable, the correct rotation method has typically been identified. However, even when an orthogonal rotation yields simple structure, when the purpose of the analysis is to obtain factor scores for use in subsequent statistical analyses, if the researcher believes that the latent constructs have noteworthy correlations, an oblique rotation (e.g., promax) may be suitable.


4

COMPONENTS VERSUS

FACTORS CONTROVERSIES

In the previous chapter it was noted that there are myriad methods with which to extract factors, some of which are available in commonly used statistics packages. It is important to understand the extent to which extraction choices may affect factor interpretations, and, especially, to understand why and when differences may arise across methods for a given data set.

For reasons that will be explained momentarily, it might be argued that

when the number of variables is [at least] moderately large, for example, greater than 30, and the analysis contains virtually no variables expected to have low communalities (e.g., .4), practically any of the exploratory [extraction] procedures . . . will lead to the same interpretations. (Gorsuch, 1983, p. 123)

Similarly, Cliff (1987) suggested that "the choice of common factors or components methods often makes virtually no difference to the conclusions of a study" (p. 349). These dynamics will be illustrated here with data for different variables and participants from the Appendix A scores.

Table 4.1 compares three of the more elegant extraction methods that, unlike principal components analysis, each invokes iteration as part of the extraction process: alpha, maximum likelihood, and image factor analysis. As an inspection of the tabled results suggests, for these data the three extraction methods yield reasonably comparable factor structures.


TABLE 4.1
Varimax-Rotated Pattern/Structure Coefficient Matrices for Three Factor
Extraction Methods (n = 100 Faculty)

                  Alpha                  ML                   Image
Variable       I    II   III         I    II   III         I    II   III
PER1         .20   .90   .18       .20   .91   .16       .21   .84   .19
PER2         .12   .82   .24       .11   .86   .19       .12   .82   .20
PER3         .23   .62   .22       .26   .64   .12       .24   .61   .19
PER4         .16   .86   .26       .19   .84   .20       .18   .80   .24
PER5         .89   .13   .22       .90   .13   .18       .86   .14   .20
PER6         .87   .14   .22       .87   .15   .19       .84   .16   .20
PER7         .92   .18   .20       .92   .20   .09       .86   .19   .18
PER8         .72   .31   .18       .72   .32   .16       .71   .32   .16
PER9         .34   .18   .44       .39   .22   .32       .37   .21   .35
PER10        .21   .23   .72       .20   .23   .95       .25   .31   .45
PER11        .17   .34   .56       .23   .37   .40       .21   .36   .42

Note. Pattern/structure coefficients greater than |.4| are italicized. ML = maximum likelihood.

The largest difference occurs on the third factor, which involves only 2 or 3 of the 11 measured variables. And the results from the image factor analysis differ somewhat more from the first two sets of results than these two sets differ from each other. This is partly because image factor analysis focuses on minimizing the influence of factors involving single variables not reflected via covariations in the "images" of the other measured variables.

However, the differences across extraction methods tend to decrease as there are more measured variables. For example, regarding image analysis, as more measured variables are added there is an increasing likelihood that most measured variables will be correlated with another measured variable or a combination of the other measured variables, so the influences of "specific" factors consisting of a single measured variable, and not representing true latent variables, will tend to become less as there are more measured variables. Bear in mind that the present small heuristic analysis involving only 11 measured variables is unrealistic. Researchers would rarely (if ever) factor analyze so few measured variables.

MEASUREMENT ERROR VERSUS SAMPLING ERROR

As noted in chapter 3, principal axes factor analysis is the same as principal components analysis, except that in principal axes factoring only the diagonal of the correlation matrix is modified by removing the 1.0's originally always residing there and iteratively estimating the true commu-


TABLE 4.2
Communality Coefficients (h²) Through Four Iterations
(n = 100 Graduate Students)

                                Iteration
Variable        0          1          2          3          4
PER1         .83714     .89374     .92230     .93880     .94904
PER2         .74596     .76327     .76468     .76277     .76061
PER3         .67464     .69098     .69167     .68996     .68843
PER4         .65751     .67103     .67066     .66857     .66692
PER5         .75740     .79961     .81378     .81939     .82197
PER6         .55652     .58434     .58885     .58870     .58798
PER7         .73583     .77671     .78818     .79155     .79257
PER8         .65091     .67977     .68493     .68521     .68479

nality coefficients. This is done to take into consideration the fact that all scores have some measurement error (i.e., scores are never perfectly reliable).

Table 4.2 illustrates the iteration process for a principal axes factor analysis involving the first eight measured variables presented in Appendix A, and limiting the analysis to the 100 graduate students (i.e., the first 100 scores in Appendix A). In this particular analysis, before iteration was begun, R² values were substituted on the diagonal of the correlation matrix. These results involve the R² between a given measured variable and the seven remaining measured variables in the analysis.

For example, as reported in Table 4.2, the R² for predicting the 100 scores on the measured variable, PER1, with the scores on the other seven measured variables in this analysis (i.e., PER2, PER3, . . . PER8) was .83714. The R² for predicting the 100 scores on the measured variable, PER2, with the scores on the other seven measured variables in this analysis (i.e., PER1, PER3, . . . PER8) was .74596.

Using the R² values in this manner is sensible, although other initial estimates of communality may also be plausible, because the reliabilities of the scores do impact the multiple regression effect size (Thompson, 2003). If a particular variable was perfectly unreliable (i.e., measured nothing, or only random fluctuations in scores), then the R² would theoretically be zero.
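These squared multiple correlations can be obtained directly from the correlation matrix without running separate regressions, using the standard identity SMC = 1 - 1 / diag(R^-1). A minimal numpy sketch (R here stands for any correlation matrix of the measured variables; the function name is an assumption for illustration):

import numpy as np

def squared_multiple_correlations(R):
    # R: v x v correlation matrix of the measured variables.
    # The R-squared of each variable with all of the others is 1 - 1 / diag(R^-1).
    return 1.0 - 1.0 / np.diag(np.linalg.inv(np.asarray(R, dtype=float)))

# For the Table 4.2 data these values (.83714, .74596, ...) are the iteration-0
# communality estimates placed on the diagonal before principal axes iteration begins.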

Table 4.2 then presents the h² estimates for each of four successive steps of iteration. These can be obtained in SPSS by invoking the CRITERIA subcommand:

factor variables=per1 to per12/
 analysis=per1 to per8/
 criteria=factors(2) iterate(1)


PRINCIPAL COMPONENTS VERSUS PRINCIPAL AXES FACTORS

By far the two most commonly used estimation methods are principal components and principal axes factors analyses. Principal components analysis is the default method in commonly used statistical packages, and is used with considerable frequency whenever exploratory factor analyses (EFAs) are performed (Fabrigar et al., 1999; Russell, 2002). The only difference between these two methods is that 1.0's are used on the diagonal of the correlation matrix in principal components analysis. In principal axes factoring, the communality estimates on the diagonal of the correlation matrix are iteratively estimated until the iterations converge (i.e., remain fairly stable across a contiguous pair of estimates).
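A bare-bones sketch of that iteration, written in numpy rather than as SPSS's exact implementation, looks like the following; the convergence rule and starting values are simplified assumptions.

import numpy as np

def principal_axes(R, n_factors, start=None, tol=0.001, max_iter=100):
    R = np.asarray(R, dtype=float)
    h2 = np.ones(len(R)) if start is None else np.asarray(start, dtype=float)
    loadings = None
    for _ in range(max_iter):
        R_reduced = R.copy()
        np.fill_diagonal(R_reduced, h2)              # put the communality estimates on the diagonal
        values, vectors = np.linalg.eigh(R_reduced)
        keep = np.argsort(values)[::-1][:n_factors]  # retain the largest eigenvalues
        loadings = vectors[:, keep] * np.sqrt(np.clip(values[keep], 0, None))
        new_h2 = (loadings ** 2).sum(axis=1)         # new communalities = row sums of squared loadings
        if np.max(np.abs(new_h2 - h2)) < tol:        # stop when estimates stay stable across iterations
            return loadings, new_h2
        h2 = new_h2
    return loadings, h2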

It would be a major understatement to say that "analysts differ quite heatedly over the utility of principal components as compared to common or principal factor analysis" (Thompson & Daniel, 1996, p. 201). As Cliff (1987) noted,

Some authorities insist that component analysis is the only suitable approach, and that the common factors methods just superimpose a lot of extraneous mumbo jumbo, dealing with fundamentally immeasurable things, the common factors. Feelings are, if anything, even stronger on the other side. . . . Some even insist that the term "factor analysis" must not even be used when a components analysis is performed. (p. 349)

Indeed, an entire special issue on this controversy was published in the journal, Multivariate Behavioral Research (Mulaik, 1992). And Gorsuch (2003) recently noted that "in following the dialogue for the past 30 years, the only major change seems to be that the heat of the debate has increased" (p. 155).

A series of comparisons of principal components versus principal axes factors for the Appendix A data makes this discussion concrete. Table 4.3 presents a principal components versus a principal axes factor analysis of

TABLE 4.3
Components Versus Principal Axes Factors for Four Measured Variables
(n = 100 Graduate Students)

            Varimax-rotated components      Varimax-rotated factors
Variable       I      II      h²               I      II      h²
PER1         .94     .19    92.3%            .88     .22    83.0%
PER2         .94     .22    92.7%            .90     .25    87.2%
PER5         .28     .86    82.1%            .27     .79    70.4%
PER6         .13     .92    85.3%            .16     .78    62.9%

Note. Pattern/structure coefficients greater than |.4| are italicized. Given the small number of variables, the convergence criterion for principal axes factor extraction was raised from the default of .001 to .003.


TABLE 4.4
Varimax-Rotated Components Versus Factors for Eight Measured Variables With
Factors Estimated With R² Starting Values and 1.0 Starting Values
(n = 100 Graduate Students)

            Components           Principal-axes factors        Principal-axes factors
                                  (R² starting values)          (1.0 starting values)
Variable   I     II     h²        I         II       h²         I         II       h²
PER1     .94    .19   91.5%    .96333    .18636    96.3%     .96362    .18625    96.3%
PER2     .88    .23   82.8%    .82979    .26108    75.7%     .82971    .26112    75.7%
PER3     .82    .32   77.2%    .75473    .34148    68.6%     .75466    .34151    68.6%
PER4     .82    .29   76.1%    .74967    .32049    66.5%     .74959    .32054    66.5%
PER5     .27    .88   84.0%    .26440    .86862    82.4%     .26447    .86841    82.4%
PER6     .15    .84   73.0%    .18993    .74214    58.7%     .18985    .74235    58.7%
PER7     .33    .85   82.6%    .32802    .82784    79.3%     .32802    .82782    79.3%
PER8     .27    .83   76.7%    .28089    .77795    68.4%     .28089    .77796    68.4%

Note. Pattern/structure coefficients greater than |.4| are italicized. For the middle panel, communality estimation was begun using R² starting values; for the right panel, communality estimation was begun using 1.0 starting values.

four measured variables. Although the patterns of the results (two measured variables saturate each of the two latent variables) are similar, two differences appear.

First, the pattern/structure coefficients tend to be smaller for the principal axes factors. Second, the communality coefficients tend to be smaller for the principal axes factors. These two outcomes are a function of changing only the diagonal of the correlation matrix from 1.0's to smaller positive numbers within the principal axes analyses.

Table 4.4 presents a principal components analysis versus two principal axes factor analyses of eight measured variables. With regard to the principal axes factor analyses, the first analysis again began with h² estimation using R² values of each measured variable with the remaining seven measured variables, and estimation continued until convergence was achieved in seven iterations. The second principal axes factor analysis began h² estimation using 1.0 values (i.e., the unaltered correlation matrix was used as a starting point), and estimation continued until convergence was achieved in eight iterations. Clearly the two sets of results are comparable, even for this example with only eight measured variables, which is why these results had to be reported to more than two decimal values.

This is the general pattern in regard to selecting starting points for iteration. Theoretically, roughly the same values will be realized regardless of the initial estimates. However, using less reasonable initial estimates (i.e., 1.0's vs. R² or reliability coefficients) will require more iterations. On a modern microcomputer, the additional computations from more iteration will usually go unnoticed.


The interpretations of the Table 4.4 components versus factors would be the same, even though the h² values tend to be larger for the component analysis. And the two sets of results are more comparable for the Table 4.4 comparisons involving eight measured variables than for the Table 4.3 comparisons involving four measured variables. Why?

Two factors affect the equivalence of the principal components and the principal axes factors for a given data set. First, if the reliabilities of the scores being analyzed are quite high (i.e., approach 1.0), then the h² values will tend to each be roughly 1.0, which is the same h² value used for each variable in the principal components analysis. This might occur, for example, if the researcher was analyzing measured variables that were each scores on tests each consisting of a large number of items.

Second, there is also a less obvious dynamic at work. The only difference between principal components and principal axes factor analyses involves changing the diagonal of the correlation matrix when principal axes factor analyses are conducted. But the impact of altering only the diagonal of the correlation matrix differs depending on how many measured variables are being analyzed.

As Snook and Gorsuch (1989, p. 149) explained, "As the number of variables decreases, the ratio of diagonal to off-diagonal elements also decreases, and therefore the value of the communality has an increasing effect on the analysis." For example, with four measured variables, there are 16 (4 x 4) entries in the correlation matrix, of which four are on the diagonal (4 / 16 = 25.0%). When there are eight measured variables, there are 64 (8 x 8) entries in the correlation matrix, of which eight are on the diagonal (8 / 64 = 12.5%).

When there are 100 measured variables, there are 10,000 entries in the correlation matrix, of which 100 are on the diagonal (100 / 10,000 = 1.0%). And it is more common to factor analyze 100 than 4 measured variables. As there are more measured variables, even if their reliabilities are modest, principal components and principal axes factors tend to be increasingly similar (Ogasawara, 2000). And principal components have some additional desirable properties, as demonstrated in chapter 5, that make these analyses attractive, at least to some researchers.

MAJOR  CONCEPTS

There are numerous EFA factor extraction methods (e.g., alpha, canonical, image, maximum likelihood). However, most frequently seen in published research are principal components and principal axes methods. Principal components is the default method in widely used statistical packages.


Principal axes methods acknowledge measurement error by iteratively approximating the communality estimates on the diagonal of the correlation matrix. When score reliability is high for the measured variables, there will be little difference between the factors extracted using these two methods.

In any case, the differences in the two methods will also be smaller as the number of factored variables increases. This is because iterative estimation of communalities changes only the diagonal entries of the correlation matrix, and the diagonal of the matrix is progressively a smaller constituent of the correlation matrix as more measured variables are analyzed (e.g., 4 diagonal entries with 4 measured variables is 4 out of 16 [25.0%] matrix entries; 8 diagonal entries with 8 measured variables is 8 out of 64 [12.5%] matrix entries). And in practice we usually factor fairly large numbers of measured variables.

There is also a trade-off in considering measurement error variances. Each sample data set also contains sampling error variance. The more steps we iterate to take into account measurement error, the more statistical estimates we are making, and the more opportunity we are creating for sampling error to influence our results.

sampling

  error to influence our results.

56

  EXPLORATORY  AND  CONFIRMATORY

  FACTOR

  ANALYSIS

Page 59: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 59/185

5

COMPARISONS  OF SOME

EXTRACTION, ROTATION,

AND  SCORE METHODS

The exploratory factor analysis (EF A) statistics called structure  coeffi-

cients were

 first

 introduced

  in

 chapter

  2, and the

  statistics called communal-

ity

 coefficients  were

 first

 introduced

  in

 chapter

 3 .

 Both

 of

 these statistics merit

further  discussion.

 This

 chapter  further elaborates

  the

  concepts und er ly ing

s tructure

  and  communali ty  (h )  coefficients hy  invoking factor score statis-

tics.

  First,

  however, some comparisons of factor score estimation methods

may be  helpful.

FACTOR SCORES  IN  DIFFERENT  EXTRACTION METHODS

As

 noted

 in

 chapter

 4, the two

  most commonly used

 EFA

  approaches

are principal components  and principal axes factor analyses. And, as noted

in

 chapter

  3,

 there

  are

 four  different  methods with which

  to

  estimate

  factor

scores.

 Here three

  factor score methods (regression, Bartlett,

 and

 Anderson-

Rubin)

  are

  compared

  to

  faci l i ta te deeper understanding

  of the

  factor score

estimation choices

  and to

  elaborate addit ional  differences

  in the two

  most

common

  extraction methods.

In real

 research,

  factor  scores are  typically only  estimated

 when

  the

researcher elects to use these scores in further substantive analyses (e.g., a

57

Page 60: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 60/185

mul t iva r i a t e analysis of var iance comparing m ean

  differences

  on three factors

across  men and  women) .  In these  applications  th e  researcher will use  only

one

  factor score estimation  method.  Here  several methods

  are

  compared

s imultaneously  only

  for

  heuristic purposes.

Factor scores

 can he

 derived

 hy

 invoking

 SPSS

 COMPUTE

  sta tements

that  apply the factor score coefficient matrix (Wy

  x

 F) to the measured

variahles in either z

  score

  or

  standardized, noncentered  form. This

 has the

heuristic value

  of

  making this compu tation process  concrete

  and

  explic i t .

However ,  in  actual research  it is

 both

  easier  and  more accurate  to  obtain

the  factor scores  by  using  the

  SPSS

  SAVE  subcommand within  the

FACTOR  procedure.

For ex am ple , to obtain regression factor scores

 from

  a principal compo-

nents  analysis involving

  11

 measured var iables (PERI through P ER U ),

  th e

syntax would

  be:

factor variables=per1 to per12/
 analysis=per1 to per11/criteria=factors(3)/
 extraction=pc/rotation=varimax/
 print=all/save=reg(all reg_pc)/ .

The execution of this commands  yields three  factor scores for each  partici-

pant f rom a pr incipal

  components

  analysis that are labeled REG _PC1,

REG_PC2,  and

 REG_PC3.

Factor scores from  a  principal axes factor analysis estimated using  th e

Bart le t t  method  would  be  produced with  the  syntax:

factor variables=per1 to per12/
 analysis=per1 to per11/criteria=factors(3)/
 extraction=paf/rotation=varimax/
 print=all/save=bart(all bar_paf)/ .

Factor scores were computed for all 100

 faculty

  using the first 11 measured

variables presented

  in

 Appendix

  A fo r all six

 combinations

  of two

 extraction

methods  (principal components vs.  principal axes factor analysis) and three

factor score estim ation m ethods (regression, Ba rtlett ,

 an d

  Anderson—Rubin) .

Table

  5.1  presents  th e

  three

  sets  of principal  components  factor scores  fo r

the first 5 and

  last

 5

 participants

  in the

  data set. Notice that  when  principal

components  methods a re used,  all three factor score  methods

 yield

  identical factor

scores

  for a  given participant (e.g.,  fo r  participant 101,  REG_PC1  =

BAR_PC1  = AR_PC1  = +.35) .

Table 5.2 presents  the  three sets of principal axes analysis factor scores

for  the first 5 and  last  5  participants  in the  data set.  Notice  that  when

58  EXPLORATORY AND  CONFIRMATORY FACTOR ANALYSIS

Page 61: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 61/185

TABLE 5.1
Factor Scores for the First 5 and Last 5 Participants From the Principal Components Analysis, Estimated by the Regression, Bartlett, and Anderson-Rubin Methods (n = 100 Faculty)
[Table values are not legible in the source scan.]


principal axes factor analyses methods are used, all three factor score methods yield different factor scores for a given participant (e.g., for participant 102, RG_PAF1 = -.76; BR_PAF1 = -.86; AR_PAF1 = -.83).

Table  5.3 presents  th e  Pearson correlation coefficients among  all pair-

wise

  combinat ions  of the 18  factor scores  for the 100  participants  in  this

analysis.

 Recal l

 that all six

 analyses invoked rotation

 to the

 varimax cri ter ion,

which

  means  that  the  three

  factors

  in all six

  analyses were

  perfectly

uncorre lated .

As the

  correlation

  coefficients  in Table  5.3

  indicate,

  the

  correlations

of th e

  factor scores from

 th e

  principal components  analyses exactly matched

th e

  correlations (here

  all

  zero)

  of the

  factors themselves.

  This

  always  (and

only always) happens

 in

 prin cipa l components analysis.

 The

  exact matching

of

  factor score correlations with factor correlations is

  another

  appealing

feature

  of principal components  analyses.

STRUCTU RE COEFFICIENTS IN A  COMPONENTS CONTEXT

As

 noted

  in  chapter  2,  correlations between measured variables and

composite variables

  are

  always called  structure  coefficients  (r )  throughou t

the  general l inear model (GLM; Courville  Thompson,  2001).  Table 5.4

presents

  th e

  Pearson

  correlation

  coefficients

  for the 11

  measured variables

with

  the

  three  factor scores

  from  the

  principal components analysis.

 These

were

  obtained

  wi th

  the

  SPSS

  syntax:

correlations variables=per1 to per11 with
 reg_pc1 to reg_pc3 .

Note  that  th e  Table  5.4 structure coefficients

  literally

  computed as

bivar ia te  correlation

  coefficients

  exact ly match

  the

  pat tern/s t ructure  coeffi-

cients produced by the  SPSS  principal  components  analysis of the  same 11

measured variables.

 These

  varima x-rota ted pat tern/s t ructure coefficients are

presented  in Table  5.5.
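The computation behind that correlations run can be sketched directly; X below is the n x v matrix of measured variables and F the n x f matrix of saved factor scores, generic names assumed for illustration.

import numpy as np

def structure_coefficients(X, F):
    # Pearson correlations of each measured variable with each factor score (a v x f matrix).
    Xz = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    Fz = (F - F.mean(axis=0)) / F.std(axis=0, ddof=1)
    return (Xz.T @ Fz) / (len(X) - 1)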

COMMUNALITY COEFFICIENTS AS R² VALUES

In   chapter  2 it was noted  that  communality coefficients indicate  th e

proportion of variance of a measured variable that  the factors as a set can

reproduce. Conversely,  h

  can be

  viewed

 as

 characterizing

  how

  much

  of the

variance of a measured variable was

  useful

  in delineating the extracted

factors.

COMPARISONS  OF

  METHODS  61

Page 64: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 64/185

TABLE

  5 .3

Factor

  Score C or r e la tions

  n =  100

  Faculty)

Principal components

Variable

REG.PC1

REG-PC2

REG-PC3

BAR_PC1

BAR_PC2

B A R _ PC3

AR-PC1

AR_PC2

AR-PC3

RG-PAF1

RG-PAF2

RG-PAF3

BR-PAF1

BR_PAF2

BR_PAF3

AR_PAF1

AR_PAF2

AR-PAF3

REG_PC1

1 0000

.0000

.0000

1

 0000

.0000

.0000

1 0000

.0000

.0000

.9903

.0054

.0485

.9904

.0061

-.0476

.9917

.0044

-.0023

Regression

REG_PC2

1 .0000

.0000

.0000

1 .0000

.0000

.0000

1 0000

.0000

-.0017

.9779

.1007

-.0064

.9748

-.0116

-.0050

.9793

.0383

REG-PC3

1 .0000

. o o o o l

.ooool

1  . O O O O l

. o o o o l

. o o o o l

1.  OOOOl

.0596

.0543

.95711

-.03141

-.0577]

.95011

.0208)

.00351

.95351

BAR_PC1

1.0000

.0000

.0000

1 .0000

.0000

.0000

.9903

.0054

.0485

.9904

.0061

-.0476

.9917

.0044

-.0023

Bartlett

BAR-.PC2

1 .0000

.0000

.0000

1 .0000

.0000

-.0017

.9779

.1007

-.0064

.9748

-.0116

-.0050

.9793

.0383

Anderson-Rubin

BAR_PC3

1

 .0000

. o o o o l

. o o o o l

1 .OOOOl

.0596

.0543

.95711

-.03141

-.05771

.95011

.02081

.00351

.95351

A R _ P C 1

1 .0000

.0000

.0000

.9903

.0054

.0485

.9904

.0061

-.0476

.9917

.0044

-.0023

AR_PC2

1

 .0000

.0000

-.0017

.9779

.1007

-.0064

.9748

-.0116

-.0050

.9793

.0383

AR_PC3

1.0000

.0596

.0543

.95711

-.03141

-.05771

.95011

.02081

.00351

.95351

One

  heur is t ic

 to

 mak e

 these

 concepts concrete is to

 conduct

 a

 m u l t i p le

regression analysis  in  which  the  measured variables  in  tu rn  are  predicted

by the  factor scores

 from

 the  pr inc ipa l components analysis.

 These

 regression

analyses  can be executed  using  the  SPSS  syntax:

regression variables=reg_pcl to reg_pc3

perl to perll/dependent=perl /

enter reg_pcl

  to

 reg_pc3

 .

regression variables=reg_pcl to reg_pc3

perl

  to

 perll/dependent=per2

  /

enter reg_pcl

  to

 reg_pc3

 .

regression variables=reg_pcl

 to

 reg_pc3

perl

  to

 perll/dependent=perll

 /

enter reg_pcl

  to

 reg_pc3

 .

6

EXPLORATORY

 AND C ONFIRM ATOR Y FACTOR  ANALYSIS

Page 65: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 65/185

Page 66: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 66/185

Page 67: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 67/185

Dependent Var iab le.

  .

  PER1

  1

 Wi ll in gne ss

  to

  help users

M ethod: Ente r REG _PC1 REG _PC2 REG_PC3

M ultiple

  R

  .93803

R  Square .87990

Variables

  in the

  Equation

Variable B SE B Beta T Sig T

REG-PC1  .277091  .048319 .202837 5.735  .0000

R EG_ P C 2

  1.224812 .048319 .896592 25.349 .0000

REG _PC 3 .255 119 .048319 .186753 5.280 .0000

(Con stan t) 7.650000 .048077 159.121 .0000

E n d

  Block Number

  1 Al l

  r eques ted var iab les en t e r ed .

De pen de n t Variable. . PER2 1 G iving use rs in dividual att

M ethod: Ente r REG _PC1 REG_PC2 REG_PC3

M ultiple

  R

  .90666

R

  Square .82204

Variables

  in the

  Equation

Variable

R E G PC 1

RE G

  PC 2

REG_PC3

(Constant)

B

.167676

1 .353442

.333508

7.310000

SE   B

.066672

.066672

.066672

.066338

Beta

.108283

.874029

.215374

T

2.515

20.300

5.002

110.194

S i g T

.0136

.0000

.0000

.0000

Figure  5.1.

  M ultip le re gres s ion predictin g

 pleasured

  variables using

 the

 three factor

s co r es

  n =

 100 faculty).

Factor scores  can  also  be  used  fo r  heuristic purposes,  to  better grasp

the nature of  both  structure coefficients and communal i ty

  coefficients.

Throughout

  th e  G L M ,  a  structure  coefficient,  score-world statistic,  can

be computed  (l i teral ly)  as the bivariate

 product-moment

  correlation of a

measured variable (e.g. ,  th e  scores  of n  people  on a  factored variable) with

a

  latent variable (e.g. ,

 th e

  principal components factor scores

  of n

  people).

And a

  com muna lity coefficient,

 an

  area-world statistic,

  in the

  orthogonal

principal  components  case

  can be

  computed

  as the R

2

 between

  th e

  factor

scores

  of n

  people with

  th e

  scores

  of n

  people

  on the

  measured variables.
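A minimal sketch of that heuristic follows: regress one measured variable on the orthogonal principal components factor scores and read off the R-squared. The function name and arguments are assumptions for illustration; for PER1 this computation corresponds to the R Square of .87990 reported in Figure 5.1.

import numpy as np

def r_squared(y, F):
    # y: n scores on one measured variable; F: n x f (orthogonal) factor scores.
    X = np.column_stack([np.ones(len(y)), F])        # add an intercept column
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ b
    return 1.0 - residuals.var() / y.var()           # this R-squared equals the variable's communality (h2)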


6

OBLIQUE  ROTATIONS

  AND

HIGHER-ORDER

  FACTORS

In chapter

 3 the

 notion

 of

 factor  rotation

 was

 introduced.

 It was

 noted

t h a t

 ro tat ion is a lways poss ib le in exploratory factor analys is (EFA) whe never

there

  is more

  than

  one  factor ,  and

  that  rotat ion

  is  almost always necessary

in  these  cases

  to

  obtain s imple s t ructure .

  O ne

  class

  of

  methods involves

or thogonal

  rotation

  and

  leaves

  the

  factors s til l perfectly uncorrelated.

  A

second class o f me thods involves obl ique ro tat ion  and  allows t he  unro ta ted

factors  ( that a re

 ini t ia l ly a lways uncorrelated)

  to

 become correlated. Ob lique

rotations present some interpretation challenges

  tha t

  mer i t fu r ther

discussion.

Table 5.5 (see chap.  5) presented  the v ar im ax- ro ta ted p r inc ipa l compo-

nents

 f rom

 an

 analysis

 o f the  first  11

 measured var iab les presented

  in Appen -

dix

  A for

  data provided

  by the 100

  facu l ty members .

  This

  solution

  had

reasonably s imple s t ructure .

  A few

  measured var iab les were

  no t

  q u i t e

  as

clear

  in

  regard

  to

  what factors they reflected.

  For

  example , PERU

  had a

pa ttern /stru ctu re c oeffic ient on F actor III of .72, but on Factor II had a

pat tern/s tructure coeff ic ient

  of

  .34.

How ever , i t is im por t ant to reme mb er that a l l cor relat ion coeff ic ients ,

including structure coefficients , must be squared before their magnitudes

can be

  compared. Thus, because

  .72 is

 51.8%,

  and .34 is

  11.6%,

  the

  larger

of  these  tw o  s tructure coeff ic ients  w as  actual ly  4 .5  times larger  than  the

smaller value.

67

Page 69: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 69/185

TABLE 6.1

Promax-Rotated Components for 11 Measured Variables  n  = 100 Faculty)

Pattern Structure

Variable

PER1

PER2

PER3

PER4

PER5

PER6

PER7

PER8

PER9

PER10

PER11

I

.04

-.07

.10

-.01

.95

.93

.94

.81

.20

-.03

-.10

II

.94

.92

.75

.90

-.06

-.05

.00

.20

-.12

.01

.17

III

-.03

.03

-.03

.05

.04

.04

.00

-.07

.70

.85

.75

I

.40

.31

.39

.37

.94

.93

.95

.85

.48

.38

.33

II

.94

.91

.78

.92

.34

.34

.38

.49

.31

.43

.51

III

.46

.46

.40

.50

.46

.45

.45

.41

.73

.84

.79

If

88.0%

82.2%

61 .5

84.7%

88.8%

86.9%

89.7%

75.5%

57.0%

71 .2%

64.7%

Note.

  Pattern coefficients greater

  than

  |.4| are italicized.

Although  the  v a r ima x - r o t a t e d  solution had  s imp le s t r uc tu r e ,  for  he u -

r is t ic purposes the solution was also rotated to the  p r o ma x

  criterion.

  Because

S

V x

p equa ls

  P

V X

H ( R F

X

H ) .

  a n

d

  R p x F

  n o w

  wi l l

  no t equa l

  I P

X

F .

  the s t r u c t u r e

and the

 pattern  coef f ic ient matr ices wi l l

 no longer be

 equa l . Indeed , hecause

the

  pa t te rn coef f ic ients

  are no

  longer cor re la t ion coef f ic ients , some,  m a n y ,

or all of the  pa t te rn coef f ic ients may be  less  than  -1 or  grea te r

  than

  +1.

Table  6.1  presents  the pattern and the  s t r uc tu r e ma t r i c e s  f rom  this

ana lys is .

 Table

 6.2

  presents

  the

  fac tor

  correlation

  ma t r ix f r om

  the

  ana lys is .

In

 chapter

 2 the

  notion

 of the

  c o mmu na l i t y c o e f f i c i e n t

 was first

 intro-

duced.  In  ac tua l i ty ,  the  examp le invo lved  a  v a r ima x - r o t a t e d p r i n c i p a l c o m-

ponents  so lu t ion.  In an

  orthogonal

  ro ta t ion  the

  h

2

  can be  c o mp u t e d  by

s u m m i n g  the  squared s t ruc ture coef f ic ients for a  var ia b le ac ross  the  factors.

This

  y ie lds

  the  total

  var iance

  of the

  measu r ed va r i ab le  common

  to the

fac tors

 as a set. The  computation is

 s implif ied

  because  the  fac tor s are  unco r r e -

la ted, and so

 there

 is zero over lap be tw een  the common var iances  of a  g iven

measu r ed va r i ab le  on any two  fac tor s .

B ut

  when  factors  are  cor re la ted ,  the  h  is  der ived  by first computing

the

  p roduc t s

  of a

  g iven measured var iab le ' s

  pattern

  coef f ic ient

  on

  each

TABLE 6.2

Factor Correlation Matrix

  (R

3x 3

)  for the

 Table

  6.1

  Factor Pattern Matrix

Factor  I II III

  1.000

  .398 1.000

III

  .473 .502 1.000

68  E X P L O R A T O R Y AND CO N F I R M A TO R Y F A CTO R  ANALYSIS

Page 70: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 70/185

factor with the  corresponding s truc ture coefficient  on each factor ,  and then

s u mming

  these

  products .

  For

  example ,

  for the

  measu red var iah le PERI

  in

the

  Table

  6.1 solution,  h for this measured var iahle is computed as :

[0.040

  x

 0.399]

  +

  [0.936

  x

 0.937]

  +

  [-0.029

  x

  0.460]

  =

0.016  +  0.877  + -0.013  =  87.96%.

It

  should  he

  noted that

  th e  algorithm  for a  given measured var iab le ,

ZP

K

S« across the  k  factors , is a more general

  formula

  fo r  h

2

. When  factors

are uncorrelated, £P

K

SK across the k factors can be reexpressed as ZP

K

2

,

because

  in

  this case

  the

  pat tern

  coefficients  are

  also s tructure

  coeff icients .

In

  o ther

  words ,  fo r  pr incipal components  and  pr incipal axes analyses  th e

algorithm ZP^Sx across  the fe  factors

  always

  works , whether  or not the

factors

  are

  correlated.

M A X I M I Z I NG S A M P LE

  F IT  VERSUS

 RE PLI CA BI LITY

In  th e  present comparison  the Tab le  5 .5 var im ax-ro tate d so lu t ion  and

th e  Table  6 .1

  promax-ro tated so lu t ion would general ly lead

  to the

  same

interpretat ions regarding

  the

  m a k e u p

  of the  three

  unde rlying constructs .

Where the  so lu t ions  d i f f e r   is in the  representat ions of the  correlat ions among

the

  latent  constructs .

The first  componen t  in

 Table

  5.5

 wou ld

  be

  interpreted

  as

  measur ing

th e

  Library

  as

  Place.

This component  w as

 saturated w ith percept ions

  of

l ibraries inc lud ing r espec t ive ly

 the

  i tems

  7 . A

 co n tem p la t i v e env i ro nmen t ,

5. A

  haven

  fo r

 qu ie t

  and

  solitude,

6. A

  meditat ive p lace,

and 8.

  Space

tha t f ac i l i t a tes qu ie t s tudy .

The

  second

 com ponent: might

 be

 labeled Service Affe ct.

This compo-

n en t w as

 saturated respect ive ly with

 the

  i tems

  1.

 W ill ingness

  to help

 users ,

2 .  Giving users indiv idual  a t ten t ion , 4 .  Employees  who are  consistently

cour teous ,

and 3.

  Employees

 w ho

  deal with users

  in a

  caring fashion.

T he  third com ponent might  be  labeled Information

  Access. This

component

  w as

  saturated respect ively with

  the

  item s 10.

  Comple t e

  runs

of

  jour nal t i t les , 11. Interdiscip l inary  l ibrary  needs being addressed,

and

  9.

 C omprehens ive  pr in t  collections.

But

  fo r

  these data

  th e

  p rom ax- ro ta ted r esu l t s r epo rted

  in

  Table

  6.1

would lead

  to the

  same factor labels

  as  those

  produced

  in the

  var imax-

rotated solution.  T he  pat tern coeff ic ients  in  Table  6.1, like  the

  pat tern /

s tructure coeff ic ients presented

  in

  Table 5.5,

  had

  s imple s t ructure .

T he  s tructure coeff ic ients in Table 6.1 ref lect a dynamic invo lv ing  th e

components

  being

  fairly

  highly correlated,

  as

 repor ted

  in

  Table 6.2. Most

of the s tructure c oeff ic ients in

 Table

 6 .1 are qui te notew or thy for a ll var iab le s

O BL I Q U E

  ROTATIONS  AND  H I G H ER - O R D ER  FACTORS  69

Page 71: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 71/185

on a l l components .  This  is mere ly indica t ive of the fac tors having corre la-

t ions  of  .398, .473,  and  .502.

Of course , the com ponen ts a re  sufficiently  discre te tha t i t

  st i l l

  seems

reasonahle  to recogn ize the  ex i s t ence of three la ten t va r iab le s , because even

the two

  mos t h igh ly cor re la ted com ponen ts on ly have 25 .2% com mo n

var iance

  ( . 502

2

  =

  . 2 5 2 ) .

  But for a

  case such

  as

  th is ,

  how  w ou l d  one

  se lec t

a

  f a c t o r an a ly t i c so l u t i on ?

 Three

  re la ted

  issues

  seem

  pa r t i cu la r ly

  re levan t .

Analytic Purpose

In

  som e fac to r ana ly t i c app l i ca t ions

  the

  researcher

  is  p r ima r i ly

  in ter-

es ted in in terpre t ing the fac tors . In other appl ica t ions , fac tor scores are

genera ted fo r use in subs equ en t s ta t i s t i c a l  analyses . As the present exam ple

suggests,  when fac to r in te rp re ta t ion i s the  focus,  rota t ion  choice  may be

less

  c r i t i c a l a s  long  as a  g iven a na lys i s y ie lds a  r e a son ab le s imp l e s t r u c t u r e .

B ut

 w hen com pos i t e scores

 are

 going

 to be

 crea ted

  for use in

 sub sequen t

analyses ,

 and w hen the ob l iqu e fac to rs a re h igh ly cor re la ted , some pre fe rence

migh t

 be

 given

 to the

  ob l ique so lu tion ,

 so

 th a t

  the

  cor re la t ions

 of the

  la tent

var iables wil l b e honored in

 these subsequen t su bs tan t ive ana lyses . Ho wever ,

there is one o ther a ppro ach to es t im at ing fac tor scores , som et ime s ca l led

 unit

weighting,  tha t may finesse  these concerns (see G orsu ch, 1983, pp.

  268—276) .

As  noted  in  chapter  4,  eve ry sample con ta ins some id iosync ra t i c and

nonreproduc ib le va r iance

  as a

 func t ion

 of

 samp ling error.

 The

  more pa r ame -

ters  w e  e s t ima t e ,  the  more po ten t i a l the re  is for our  re su l t s  to  cap i t a l i ze

excess ively  on

  u n i q u e

  fea tures  of the

  s amp l e , w i th

  the

  consequence

  that

resul ts  wil l

  genera l ize less

 w e l l  to new

  samp le s .

 One way to  capi ta l ize

  less

on id iosync ra t i c

 fea tures

  of the sam ple is to com pute fac tor score es t im ates

w i t h o u t

  us ing

  the  fac tor

  score we igh t ma t r ix  ( W v

x

p ) -

One approach

  s imply

  ident i f ies  those m easured va r iab le s tha t

  def in i -

t ively

  mea su re

  a

  single fac tor .

  Then

  composi te scores

  on

  each fac tor

  are

com puted m ere ly by su m m ing the scores on the re leva n t measured va r iab le s ,

by  av era ging these scores , by av eraging the

  z

  scores of only the re levant

var iab les ,

  or by do ing wha teve r e l se the re sea rche r deems though t fu l and

reasonab le . For ex am ple , ac ross  both  the  va r imax - ro t a t ed and the pro 'max-

rota ted resul ts

  for the

  p re s en t e xamp le ,

  the

  s ame m ea su red va r i ab le s wo u ld

be used

  to

  ob ta in un i t -we igh ted compos i t e scores

  on the

  three construc ts .

Parsimony

Centur ie s  ago a  ph i losopher named Wi l l i am of Occam a r t i c u l a t ed the

concept  w e recognize today as the law of pars im ony. Thus, th is precep t i s

somet imes ca l led

  Occam's

  razor. Bas ically,

 the law of

 pars im ony says that

w h e n

  tw o

  exp l an a t i on s

  fi t a set of

  facts  rough ly

  equa l l y

  we l l ,

  a ll

  things

70  EXPLORATORY

 AND

 CONFIRMATORY FACTOR  ANALYSIS

Page 72: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 72/185

equal,  the

  simpler explanation

  is

 more

  l i ke l y  to be

  true.

 We

  tend

  to  p re f e r

parsimonious explanations because things that are true are also more  l i ke l y

to replicate in

  fu tu r e

  research, and researchers

  u s u a l l y

  do not want to be

embarrassed  by

 making discoveries

 that  no

  one  else

  can

  replicate.

In

  an EFA context, orthogonal solutions  are  more parsimonious

 than

their oblique counterparts because  f e w e r parameters are estimated  when an

orthogonal rotation

  is

 implemented. This

 is

 because when

 R

F x

 F

 equals

 Ip

 x F

,

then  PV

 x

 F equals  Sy

  x

 F-

In any  s t a t i s t i c a l ana ly s i s ,

 including EFA, what

  we are

  trying

 to do is

fit a

  model

  to our

  data.

  The

  more parameters

  we

  estimate,

  the

  greater

  is

the likelihood  that  our model

  w i l l

  fit the sample data. For example, if n is

2,  for any two  measured

  v a r i a b l e s ,

 r

2

  is

 a l w a y s

 +1. For a  regression  ana ly s i s ,

when

  the

  number

 of

 predictor variables equals

 n  — 1, the

  degrees

 of

 freedom

error

 is

 zero,

 and the

  resulting

 R is

 always

 +1,

 regardless

 of

 what

 the

 predictor

variables

  are.

The

  same situation applies across

  d i f f e r e n t

  analyses.

  For

  example,

  in

predictive

  discriminant analysis (PDA), when

  less

  parsimonious equations

called quadratic versus linear rules

 are

  used,

  the

  prediction

  hit

  rate tends

to be higher in the original sample, but the prediction  eff icacy  of the same

prediction equation across  other  samples tends to be lower than in the

original sample. Thus, Huberty (1994, p. 64) noted a general preference for

the  more parsimonious  PDA  equations, even though  the fit of the  model

to

 sample data tends

 to be

 better

 for the

  less parsimonious solution. Elsewhere

he  noted

  that

  the linear rule is suggested because of presumed greater

stability

 [across samples]

 than

 that

 for the quadrat ic rule, in general (p. 260).

Interpretation  Difficulty

It  must also  be

  sa id  that

  factor interpretation

  i tself

  is  more complex

when

 factors are correlated.

  This

 is because more parameter estimates must

he

  simultaneously interpreted. Specifically, throughout

  the

  general linear

model (GLM), whenever weights

  are not

  structure coefficients,

 both

 stan-

da rd iz ed  weights  and  structure

  coe f f i c i en t s

  must  be  examined  to  arrive  at

correct interpretations.

This

  is true with reference to analyses including mul tiple regression

(cf.  Courville  &

 Thompson,

 2001;

  Thompson

 &  Borrello, 1985), canonical

correlation analysis (Meredith, 1964;

  Thompson,

  1984), and confirmatory

factor

 analysis (cf. Graham

  et

  al.,

 2003; Thompson,

  1997).

  This is

 also true

in

  EFA.

  As

  Gorsuch (1983) emphasized,  proper

  interpretation of a set of

factors

  can  probably only  occur  if  at  least  S and P are both examined (p .  208).

A variable does not contribute to a factor only if the variable's pattern

and

  structure

 coe f f i c i en t s  are both

 zero.

 If a

 structure coefficient

 is

 near zero,

but the pattern coefficient is not, the measured variable contributes to the

OBLIQUE  ROTATIONS

 AND

  HIGHER-ORDER

  FACTORS

  71

Page 73: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 73/185

solut ion indi rec t ly

 by m odify ing  the

  re la t ionship

 of

 other measured var iables

w ith the

  factor. Across

 the GL M

 this

 is

 called

 a

 suppressor

  e f f e t

  (see Co urv il le

& Thompson,  2001; Horst, 1966).

For  the  Table  6.1  resul t s , the  analysis  does  not  present overwhelming

difficulties,

  because  the  s t ruc tu re is so  clear.  The  three factors

  ( Library

  as

Place, Service  Affect , and  Inform ation Access ) w ere highly correlated

but

  sti l l

  sufficiently

  discre te

  to be

  recognized

 a s the

  constructs described

 by

the pattern coefficients. But the structure  coefficients  in this case  clear ly

indica te  tha t  all 11 variable s are correlated w ith al l three c onstructs. Such

findings

  suggest

  the

  presence

  of

  overarching superordinate const ruc ts ,

  or

higher-order factors.

H I G H E R - O R D E R

  FACTORS

When

  variables  are  correlated (i .e . , R y

x

v  ^  l v \ v ) >

  then

  factors  can

be

 computed

  as

 weighted aggregates

 of the

  measured va r i ab le s .

 By the

  same

token, when  factors  are  correlated (i.e.,  R F

X

F     IF

X

 F ) >

 then  factors  can be

extracted

  from

  this matr ix

 as

 wel l .

The factors extracted   from  interv ariab le correlat ions (o r other stat ist ics

measu r ing  associat ions)

  are

  called

 first-orde r

 factors.

  T he

  factors

  then  ex-

t rac ted

  from

  the inter fac tor correlat ion s among the first-order factors are

called

  second-order

  factors.

  If the

  second-order  factors

  are

  correlated,

  then

third-order

  factors

  can be

  ext rac ted. However ,

  as

  Kerl inger (1984) noted,

  whi le ordinary fac tor

  analysis  is

  probably

  wel l

  unders tood, second-order

factor

  analysis,

  a

  vi ta l ly  impor t an t pa r t

  of the

  analysis,  seems

  not to be

widely  k n o w n

  and

  understood

(p .

 x i v v ) .

It is  argued  here that these higher-order  factors

  should

  be  ext rac ted

whenever fac tors are corre la ted. As Gorsuch (1983) em phasized:

Rotating

 ob l i que ly

  in

  factor analys is

  impl ies

 that

  the

  factors

  do overlap

and that  there  are,

  there fo re ,

 broader areas of  generalizabil i ty  than  j u s t

a

 p r i m a r y fac to r .  I m p l i c it

 in all

 oblique

 rotations

 are

 h ig he r - o r de r  factors .

It

  is recommended  that these  [always]  be

  extracted

  and examined so

that

  the

  inves t iga tor

  may

 ga in

  the

  ful les t poss ible

  understanding of the

data.

  (

P

.  2 5 5 )

Too few

  researchers report ing correlated

  first-order

  factors conduct these

needed higher-order analyses.

The   different  levels of factor analysis give

  different

  perspec t ives on

the

 da ta . Thompson (1990b)

  offered  th e

  analogy

 o f

 hiking

 in the

 m oun t a in s

versus flying over  them. When  you are hiking you can see rocks and trees

and fish in

  streams.  When

  you fly

 over

  the

  mountains,

  you can see the

pat terns made

  by the

  moun ta ins

  as a

  range.

  The

  perspect ives complement

72  EX P LO R A TO R Y A ND  CONFIRMATORY

  FACTOR

 ANALYSIS

Page 74: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 74/185

TABLE 6.3

Second-Order

 Factor Pattern/Structure Coefficient Matrix

First-order factor

  A

  If

FACTJ

  .773 59.8

FACTJI

  .792 62.7

FACTJII

  .832 69.2

each  other, and  each is needed to see patterns at a given level of  specif ic i ty

ver sus

 generality.

The literature includes various examples of higher-order factor analyses

(cf . Borrello  & Thompson,  1990;  Cook, Heath, & Thompson, 2001; Cook

&

 Thompson,

 2000; Thompson & Borrello, 1986). Here

 both

 the implemen-

tation

  of a

 higher-oider analysis using SPSS

  and the

  interpretation

  of

 such

an analysis are illustrated.

Example  1

For the  Table  6 .2  interfactor correlation matrix  (R ^

 x

 5) the  second-

order  factors can be  eas i ly  computed in

 SPSS

 by analyzing the correlations

among

  the first-order

 principal components scores.

 The

  syntax

 is:

subtitle  '5

  n=100

  Faculty

  AAA

 First-order

A A A A A A

 

.

execute .

temporary

 .

select

  if (ranktype eq 3) .

factor

 variables=perl

 to per 2/

analysis=perl to erll/criteria=factors(3)/

extraction=pc/rotation=promax/

print=all/save=reg(all

 reg_pc}/ .

subtitle

  6 Second-order $$$$$$$$$$$$$$$$$$$$$$$$'  .

execute

 .

factor

 variables=reg_pcl to reg_pc3/

print=all .

This  works  only  fo r  principal components analysis, because only  in

principal

  components analysis

  wi l l

  the factor correlations and the factor

score correlations match exactly.

  The

  second-order factor derived

  in

  this

manner is presented in Table

 6.3.

Higher-Order Interpretation Strategies

The  interpretation  of higher-order factors poses some special  dif f icul -

ties. Factors

  are

  abstractions

  of

  measured variables. Second-order factors,

OBLIQUE

 ROTATIONS AND

 HIGHER-ORDER

  FACTORS  73

Page 75: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 75/185

then,

 are abstractions of abstractions even more removed  f rom  the  measured

va r iab le s .

 Somehow

 we

 would like

 to

  interpret

  the

  second-order factors

  in

terms of the measured variables, rather than as a manifestation of the factors

of

  the

  measured variables.

As Gorsuch (1983) suggested,

T o  avoid basing  interpretat ions upon interpretat ions of interpretat ions,

the relationships of the  original variables  to  each  level  of the higher-

order factors  are

 de te rmined .

  . . .  Interpret ing from  th e  variables

  should

improve

  th e

  theore t i ca l  unders tanding

 of the data and

  produce

 a

 better

ident i f icat ion

  of

 each

 higher-order

  factor,  (pp. 245-246)

There are three strategies for doing so.

Firs t ,

 Gorsuch (1983) suggested

  that  the

 second-order factors directly

ref lec t ing  the  measured variables could  be  evaluated  by  computing  the

product matrix  of the first-order

  pattern

  matrix times  the  second-order

pattern matrix  ( i .e . , here

  PH

  x 5

P}

X

 ] = PH

  x

  i) . Second,

 Thompson  (1990b)

reasoned

  that

 an orthogonally rotated product matrix might be  interpreted.

He

  reasoned

  that  if

  rotation

  is

 sensible

  in

  both

  the first and the

  second

order,

 then

 rotation  of the  product matrix to achieve simple structure might

also   be

  usefu l .

Of  course,  in the  present example involving only  one  second-order

fac tor ,

  rotation

 is not possible. Rotation  is only possible when  there are two

or  more factors.

Third, Schmid  and  Leimaii (1957) proposed  an  elegant method  for

expressing

 both the first-order and the

  second-order factors

 in

  terms

 of the

me a su r e d  variables, but also  res idua l iz ing  (removing) all variance in the

first-order factors

  that

  is

 also present

  in the

 second-order factors.

 This

 allows

the  researcher  to  determine what,  if any, variance  is unique to a given level

o f analysis or perspective. Schmid and Leiman (1957) suggested that their

Schmid-Leiman solution not only preserves

 the

 desired interpretation  char-

acteristics

 of the

 oblique solution,

 but also

 discloses

 the

 hierarchical structur-

in g  of the variables (p. 53).

This analysis

 can be

  implemented using SPSS matrix commands.

  The

input into  the analysis includes  (a) the obliquely rotated first-order pattern

matrix (here

  P j i x ? )

  an

d (b) the  second-order pattern matrix (here

  P ?

x

i )

augmented

  by a

  symmetrical (here 3 x 3 ) matrix containing zeroes off-

diagonal

  and the

  square roots

  of

 uniquenesses

 on the

  diagonal.

This augmented matrix, A, is of rank / by s + /, where / is the number

of  f irst-order  factors  and s is the  number  of  second-order factors.  In the

current example, this augmented matrix

  is

 A3

x4

.

R e c a l l

  that

  a communality coefficient

  (h

  )  indicates  in an  area-world

squared  metric the  percentage  of  variance  in a  measured variable

  that

  is

74

  EXPLORATORY

 A ND

  CONFIRMATORY FACTOR  ANALYSIS

Page 76: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 76/185

TABLE

  6.4

Augmented Matrix  A

3 x 4

SECOND

0.773

0.792

0.832

U|

0.634

0.000

0.000

SORT of uniquenesses

Un

0.000

0.611

0.000

U

0.000

0.000

0.554

Note.

  SORT (0.402) = 0.634.

containe d in the fa ctors as a   set. The uniqueness ( i t ) i s an area-w orld

s tat i s t ic

  ind ica t ing the percen tage o f var i ance in a measu red var i ab le

 n o t

con ta ined  in the fac tor s  as a set (e.g.,  100%  -  h

2

 = u

2

, or 100%  -  59.8% =

4 0 . 2 % ) .

  Table

  6.4  presents  the  m a t r i x , A^^ for  this analysis .

The  SPSS

  sy n t ax

 file for the

  p resen t example

 is:

MATRIX .

COMPUTE

 PI =

{0.040,  0.936,

 -0.029

0.917,

 0 031

0.751, -0.026

0.900, 0.050

0.037

0.037

0 . 944

0.805

0.198

-0.070,

0.103,

-0.014,

0.947,

 -0.061,

0.933, -0.049,

0 . 003,  0.003

0.202,

 -0.070

-0.121,

0.700

 ;

-0.026,

  0.010,

  0 851

  ;

-0.096,  0.165,

  0.753}

 .

COMPUTE

 P2 =

{0.773  ;

0.792

  ;

  832} 

COMPUTE

 Pvh = PI * P2 .

COMPUTE

 A =

{0.773, 0.634, 0.000, 0.000 ;

0.792, 0.000, 0.611, 0.000 ;

0.832, 0.000, 0.000, 0.554} .

COMPUTE SL = PI * A .

PRINT

 PI /

FORMAT='F8.3' /

TITLED First-order  Promax-rotated  Pattern

Matrix

/

O B L I Q U E  ROTATIONS

  AND

  H I G H E R - O R D E R

  FACTORS 75

Page 77: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 77/185

Page 78: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 78/185

TABLE

 6.5

Pattern Product Matrix (Pn

x 1

)

Variable SECOND

PER1 .748

PER2 .698

PER3 .653

PER4 .744

PER5  .715

PER6

  .713

PER7 .735

PER8

  .724

PER9 .640

PER10 .696

PER11 .683

TABLE 6.6

Schmid-Leiman Solution for Example 1

First-order

Variable

PER1

PER2

PER3

PER4

PER5

PER6

PER7

PER8

PER9

PER10

PER11

SECOND

.748

.698

.653

.744

.715

.713

.735

.724

.640

.696

.683

FACTJ

.025

-.044

.065

-.009

.600

.592

.598

.510

.126

-.016

-.061

FACTJ I

.572

.560

.459

.550

-.037

-.030

.002

.123

-.074

.006

.101

FACTJ

 II

-.016

.017

-.014

.028

.020

.020

.002

-.039

.388

.471

.417

 x mple   2

The  second

  example invo lves  perceptions

  of

  l i b ra ry s e rv i ce qua l i t y

provided  by

  61,316

  u n i v e r s i t y

  students

  and  facul ty us ing  the 25

LibQUAL+  i tems presen ted  in

  Table

  6.7. Four  first-order  fac tors were

extracted.  For  heur is t ic purposes on ly ,  in  this analysis  two

  second-order

fac tors

  were extracted. The

  Schmid-Leiman

 so lu t ion is reported in Table  6.8.

The richness of higher-order fac tor analy s is can be communicated by

i n t e r p r e t i n g  these  results .

  The two

  second-order  fac tors measure Lib rary

 as

Place ( SEC_B )

  and all

  other  perceptions

  as

  SEC_A.

H o w e v e r ,  the

nine

  Service  Af fec t and the six

  Personal

  Control i t ems pa r t i cu l a r ly

dominate  the  first  second-order

  fac tor .

O B L I Q U E

  ROTATIONS

  AND  H I G H E R - O R D E R

  FACTORS

  77

Page 79: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 79/185

Page 80: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 80/185

TABLE 6.8

Schmid-Leiman

  Solution

 fo r Example  2

(n 

61,316 Students

 and

  Faculty)

Second-order

First-order

Variable

PR1

PR4

PR11

PR14

PR15

PR17

PR18

PR20

PR24

PR2

PR10

PR13

PR21

PR23

PR 3

PR8

PR9

PR19

PR22

PR5

PR6

PR7

PR12

PR16

PR25

Sum of

squares

SEC_A

.655

.624

.690

.635

.660

.704

.713

.610

.689

.200

.206

.262

.328

.260

.526

.566

.559

.416

.512

.628

.685

.708

.736

.761

.625

8.531

SECJ3

.153

.201

.260

.265

.256

.234

.216

.294

.310

.754

.820

.771

.705

.824

.281

.169

.274

.324

.406

.106

.239

.129

.202

.231

.344

4.312

FACTJ

.422

.450

.291

.403

.469

.366

.430

.382

.378

-.025

-.028

.028

.035

.005

-.060

.048

.033

.037

-.052

-.041

.000

-.039

.061

.103

.079

1.494

FACTJ

 I

-.011

-.001

.003

.009

.005

-.006

-.008

.006

.004

.124

.132

.117

.095

.120

-.009

-.018

-.005

.014

.009

-.002

.012

-.001

.003

.003

.010

0.071

FACTJII

.024

-.012

.072

.005

-.022

.020

.004

-.057

-.029

.045

.025

.001

-.023

-.043

.010

.072

.027

-.021

-.044

.364

.306

.399

.306

.260

.068

0.576

FACTJV

-.045

-.062

.041

-.038

-.053

.047

.007

.054

.077

-.064

-.042

-.014

.086

.067

.419

.254

.308

.247

.449

-.019

.033

-.016

.012

.044

.228

0.694

set printback=listing .

DATA LIST

 records=3 /

ROWTYPE_ (A8) FACTOR— (F2.0)

Prl Pr4 Prll Prl4 Prl5 Prl7 Prl8 Pr20 Pr24

Pr2 PrlO Prl3 Pr21 Pr23

Pr3 Pr8 Pr9

 Prl9 Pr22

Pr5 Pr6 Pr7 Prl2 Prl6 Pr25

(1X,9F5.3/11F5.3/5F5.3).

BEGIN

 DATA

FACTOR 1 .655 .624 .690 .635 .660 .704 .713 .610 .689

.200  .206 .262 .328 .260 .526 .566 .559 .416 .512 .628

.685  .708 .736 .761 .625

OBLIQUE  ROTATIONS

  AND  HIGHER-ORDER  FACTORS

79

Page 81: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 81/185

TABLE  6.9

Equamax-Rotated Product Matrix

  n =

 61,316 Students  and  Faculty)

Variable

PR1

PR4

PR11

PR14

PR15

PR17

PR18

PR20

PR24

PR2

PR10

PR13

PR21

PR23

PR3

PR8

PR9

PR19

PR22

PR5

PR6

PR7

PR12

PR16

PR25

A

.653

.622

.688

.632

.658

.702

.711

.607

.686

.193

.198

.255

.321

.252

.523

.564

.556

.413

.508

.627

.683

.707

.734

.759

.622

Second-order factor

B

.159

.207

.266

.271

.262

.241

.223

.300

.316

.756

.822

.773

.708

.826

.286

.174

.279

.328

.411

.112

.245

.136

.209

.238

.350

FACTOR 2

  .153 .201 .260 .265 .256 .234 .216 .294 .310

.754 .820 .771 .705 .824 .281 .169 .274 .324 .406 .106

.239

  .129 .202 .231 .344

END DATA.

FACTOR

MATRIX=IN(FAC=*)

 /

ANALYSIS=prl to pr25 /

PRINT=ROTATION /

CRITERIA=ITERATE(75) /

ROTATION=equamax

 .

T he

  equam ax- ro ta t ed pa t t e rn / s t ruc tu re coe fficients

  for the

  analysis

 are

reported  in  Table 6.9.  In the  present example fac tor in terpre ta t ions would

be  l i t t l e a l te red  by  this rotat ion.

O f  course , th e  produc t m a t r ix would neve r  be  obliquely rotated. That

would

  imply  the  exis tence  of  higher-order fac tors .  In  conduc t ing ob l iqu e

80  E X P L O R A T O R Y AND C O N F I R M A T O R Y  FACTOR ANALYSIS

Page 82: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 82/185

rotations,  always  extract

  the

  progressively higher-order factors imp l ied

 hy

nonzero inte rfacto r correla t ions, unt i l  (a)  there  is only a single higher-orde r

factor o r (b) the factors at the highest level of analysis have s imp le stru ctu re

when ro ta ted or thogonal ly .

M A J O R

  CONCEPTS

If

  s imple s t ruc ture cannot  be obtained with orthogonal rotation, re-

flected in

 m any var iab les having p a t t e rn / s t ruc ture coef fi c ien t s tha t

  are

 large

in

  magni tude

  on two or

  more factors, oblique rotation

  may be

  needed

  to

derive  interpretable factors . However, more parameters must

  be

  est imated

in

  obl ique rotat ion. As a consequence, obl iq ue resul ts not o nly (a) tend to

fit  sample  data  better  but  also  (b)

  tend

  to  yield factors  that  replicate less

well .

  As a consequence, unless obl iqu e resul ts are substa nt ia l ly

 be t te r

  than

orthogonal resul ts , or thogonal resul t may be pre fe r red .

Whenever obl ique

 ro ta t ion  is

 perform ed, higher-order factors

  are im-

plied

  and  should  be  derived  and  interpreted .  The  Schmid-Leiman so lu t ion

is

 part icular ly useful  in such cases. T he

 SPSS

 syntax p resented  in the  chapter

invokes

 SPSS

  MATRIX p rocedure s  and can be  used as a m odel  to perform

these analyses using this  widely  ava i l ab le  software.

OBLIQUE ROTATIONS  AND  HIGH ER-O RDER FACTORS  81

Page 83: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 83/185

7

SIX   TWO-MODE TECHNIQUES

Psychologists  of ten think

  in

  terms

  of

  different  types

  of

  people, such

as wo rkaholics  or Type  A  personalities. Yet the  factor analyses discussed in

previous chapters d o

 n ot

 address question s regard ing types of people Instead,

all the previous discussion

  implicitly

 involved investig ations of the structures

under lying variables.

The

  previous analyses involved

  a raw

  data matrix

  in

  which people

were  represented as data rows and the variables were organized as colum n

fields  wi th in  the  data  matr ix .  When  da ta  are  analyzed  in  this manner ,

intervariable relationships and factors of the measured variables are the

focus  of the  analysis.

But

  data matrices

  do  not

  h av e

  to be

  organized

  in

  this manner.

  For

example, data could be organized such that variables define the rows and

people define the columns. When da ta are organized in this ma nner, interper-

son relationships are the  focus  o f the analy sis, and people factors are the

result. Indeed,  there

  are

  other data structures

 that  are

  possible

  as

 well.

The

  mathemat ics

 of the

  analysis remain

  the

  same across these varia-

tions.

 The

  computer package need

  not

  know what data  features

  are

  being

correlated and analyzed

CATTELL'S  "DATA BOX "

Raymond  Cattell  (1966a) sum ma rized six prim ary exploratory factor

analysis  (EFA) choices  in the  form  of his "data box." Cattell suggested that

83

Page 84: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 84/185

data could involve (a) either one or more than one

  variable,

 (h) either one

or

  more  than

  one  person,  and (c)

  either

  one or

  more  than

  one  occasion of

measurement.

  These  features

  of the data are termed

  modes.

Some data sets might involve several  variables,  several people,

  and

several

 occasions

 of

 measurement.

  It is

 theoretically possible

  to do a

 factor

analysis that simultaneously considers all three  modes. Tucker (1966) pro-

vided some guidance on performing these analyses.

However, three-mode analyses

 are

 extraordinarily complex,

  and

  com-

monly used

  software

  does not support these methods. Thus, only two-mode

analyses are routinely seen in the li terature. Within each two-mode combina-

tion (e.g., variables and people, measured at only one occasion), there are

two

  possible organizations

 of the

  data matrix (e.g., people

  define  the  rows

and

  variables

 define  the

  columns,

  or

 variables

 def ine  th e

  rows

  and

  people

define

  the columns). Thus, there are six two-mode techniques. Each two-

mode technique

  was

 labeled

  by

  Cattell using

 a

 d i fferent

 capital letter (e.g.,

R Q).

Six Two-Mode Techniques

R-Technique

When

  people

  define  the

  rows

  of the

  data matrix,

  and

  columns

  are

used  to represent  different  variables, the  analysis  examines the structure

underlying variables. This analysis is termed  R-technique factor analysis. All

of

  the

  previous examples

  in

  this book were R-technique examples. This

 is

the most commonly used two-mode EFA technique, and is so common

that people invoking  this  analysis often

  do not

  even label

  the

  analysis

R-technique.

Q-Technique

When

  variables

 define

  the rows of the data matrix, and columns are

used  to

  represent

  different

  people,

  the

  analysis

  identif ies

  people factors

(rather  than factors of variables).

 This

 analysis is termed Q technique factor

analysis.  Q-technique is the

  second

  most

  widely

  used  EFA method.

P-

 Technique

When

 variables

 are

 factored,

 and

  occasions constitute

  the

  rows

 of the

data matrix,

  th e

  analysis

 is

 called

  P-technique

 factor analysis. P-technique

m ay involve a single-person study. However,  the  analysis might also involve

means, or medians, summarizing for each variable and each occasion all the

scores as a  single number.

84  EXPLORATORY

  AND

 CONFIRMATORY  FACTOR  ANALYSIS

Page 85: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 85/185

Page 86: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 86/185

of categories with each category co ntain ing exact ly a p rede termined num ber

of objects. The sorted o bjec ts can be anyth ing (e.g., ind ex cards conta ining

statements, photographs

  of

  ar t ) .

  The

  sorting criteria

  can  vary

  (e.g.,

  how

much participants agree  or  disagree with  the  statements, aesthetic valuing

of

  the

  ar t) .

Typically

  the fixed single distributio n is developed to be sym me trical

and

  quasi-normal

  in

  shape.

  An

  exam ple might involve sorting index cards

containing statements into categories

  1

 (m ost agree) through

  7  most  disagree)

using  the

  following

  dis t r ibut ion:

Category  1 2 3 4 5 6 7

n

  o f

 cards

  2 4 6 9 6 4 2

In th is example

 every

  column of data (i .e. ,

  every

  person's data) would

involve exactly two 1's, exactly

 four

  2's, and so forth. The mean of

  every

column would   be  4-00

  SD =

  1.58) with  a  coefficient  of  skewness  of  0.00

and a

  coefficient

  of

  kurtosis

 of

  — 0.50 .

Ranking

In a ranking strategy par ticipa nts are instructed to ra nk order the

objects

  (e.g., index cards with statements) using some criterion (e.g.,

  from

most agreem ent to most di sag reem ent). The appe al of this strategy is that

the method yields grea ter score va rian ce. Because more in form atio n is being

collected, this,

  in

  tu rn , theore t ica l ly

  will

  yield more sensi t ive

  or

  stable

interperson correlations,

  and

  thus better person factors.

  The

  downs ide

 of

the strategy is that  participants may find rank ordering 60 or 100 objects

to be

  tedious

  and

  painful.

Mediated   Ranking

Thompson  (1980a) suggested

 an

  alternative two-stage strategy

 fo r

  col-

lecting ranked data that makes  the  ranking process more manageable  for

participants. First, participants are asked  to  perform a  conventional  Q  sort.

Second,

 the

 part icipants

 are

 then asked

 to

 rank order

 the

 objects with in

 each

category. This means th at fewer objects must

 be

 simultaneously considered

 by

the  part icipants when performing a  given ranking task.

 euristi

Example

The

  hypothetical data presented

  in  Table  7.2 can be

  used

  to

  make

concrete  the  discussion  of Q-technique  interpretation issues. The  example

presumes  that  five students and  four  faculty  rank ordered 20

  features

  of

l ibrary

  service quality

 from  1

 (most  important)

  to 20  least

  important).

 Table

88  EXPLORATORY AND  CONFIRMATORY FACTOR

  ANALYSIS

Page 87: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 87/185

TABLE

 7.2

Q Technique

  Heuristic

  Data

S t u d e n t

  F a c u l t y

I t e m

  1 2 3 4 5 1 2 3 4   I t e m   c o r e

1 1 1 1 6 1 3 1 1 1 1 2 0 1 0

 

2 0 S A   E m p l o y e e s   w h o   i n s t i l l   c o n f i d e n c e   i n   u s e r s

2 1 6 1 3 1 6 1 6 1 6 7 7 9 1 I A C o n v e n i e n t b u s i n e s s  hours

3

  1 3 1 1 1 1 1 3 8 1 1 1 7 I P  S p a ce t h a t f a c i l i t a t e s q u i e t s tu d y

4 8 8 7 8 1 3 9 9 2 0 9 S A E m p lo y e e s d e a l u s e rs i n a c ar in g f a s h i o n

5 1 2 1 2 1 1 2 1 2 1 0 2 0 1 0 1 0 S A   E m p l o y e e s u n d e r s t a n d n e e d s   o f   t h e i r u s e rs

6   7 7 8 1 7 8 8   1 6 8 I A

 C o m p l e t e r u ns

  o f

  j o u r n al t i t l e s

7 1 1 1 2 9 1 1 6 1 6 8 1 7 S A G i v i n g u s e r s i n d i v i d u a l a t t e n t i on

8 1 5 1 7 1 5 1 5 9 3 3 3 1 6 S A

  E m p l o y e e s

  w h o a r e

  c o n s i st e n t l y c o u r te o u s

9 9 9 9 7 1 5 1 7 1 3 1 7 3 I A

  C o m p r e h e n s i v e p r i n t c o l l e c t i o n s

1 0   4 4 4 4 4 1 2 1 2 1 3 1 2 S A

  E m p l o y e e s k n o w l e d g e

  t o

  an s w e r u s e r q u e s t i o n s

1 1

  1 7 1 5 1 7 1 7 6 1 3 1 7 1 2 1 3 P C li b we b

  s i t e e n a b l i n g

  m e

  l o c a t e i n f o

  o n m y o w n

12   6 6 3 6 17 4 4 6 4 LP A

 contemplative environment

1 3

  3 3 6 3 3 6 6 4 6

  L P A

  p l a c e

  f o r  r e f l e c t i o n   a n d

 c r e a t i v i t y

14 14 10 14 19 14 15 15 11 14 1A Timely document delivery interlibrary loan

1 5 1 0 1 4 1 0 1 8 1 0 1 4 1 1 1 4 1 5 S A   R e a d i n e s s   t o   r e s p o n d   t o  users q u e s t i o n s

1 6

  2 0 2 0 18 10 1 9 1 1 1 4 1 5 1 1 P C

  C o n v e n i e n t a c c e s s

  t o

  l i b r ar y c o l l e c t i o n s

1 7

  1 9 1 9 2 0 1 4 2 0 5 5 2 5 L P A   h a v e n   f o r   q u i e t   a n d   s o l i t u d e

1 8   2 5 2 2 5 2 2 5 2

  L P A

  c o m f o r t a b l e

  a n d

 i n v i t i n g l o c a t i o n

1 9

  5 2 5 5 2 1 9 1 8 1 9 1 8 S A W i l l i n g n e ss t o h e l p u s e rs

2 0

  1 8 1 8 19 2 0 18 1 8 1 9 18 1 9 I A

  I n t e r d i s c l i b r a r y n e e d s b e i n g a d d r e s s e d

7.3 presents

  th e

  v arima x-rotated principal components pattern/structure

coefficients from

  this

  analysis.

The  f i r s t  question addressed  by the  analysis  is  "How many types of

people

  are

  there?" Here

  th e

  answer

  was

  two. Presumahly

  th e

  decision

  to

extract two factors was guided hy various evalua tion criter ia (e.g., eigenvalues

greater than 1.0, scree test, parallel

 analysis)

 as it would he in any

 other

 EFA.

TABLE 7.3

Varimax Rotated  Pattern/Structure Coefficients

Person

STUD1

STUD2

STUD3

STUD4

STUDS

FAC1

FAC2

FAC3

FAC4

Factor

I

 96

 94

 85

 8

75

 03

 07

 .09

 13

II

 02

 .01

 12

 23

 .14

 93

 9

74

 76

  ote

Values greater than |.40|

  are

  presented

  in

  italics.

SI X  TWO-MODE  TECHNIQUES  89

Page 88: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 88/185

TABLE 7.4

Salient Factor Scores

 for

 Factor

 I

 Sorted

 by

 Descending Levels

of Agreement

Factor score

Item

  I II

  SPSS variable label

19

18

13

10

7

16

20

17

-1.37

-1.35

-1.27

-1.27

-1.12

1.32

1.48

1.60

1.66

-1.51

-.90

 39

 96

 26

1.52

-1.33

SA

  Willingness to help users

LP A

 comfortable

  and

  inviting location

LP A

 place

  for

  reflection

 and

 creativity

SA   Employees knowledge  to answer user

 questions

SA

 Giving users individual attention

PC

 Convenient access to  library collections

IA  Interdisc library needs being addressed

LP  A haven  for quiet  and  solitude

The

 second

 question addressed  is

  Which

 people belong to the different

types?"

 Here the  five  students saturate Factor  I, and the  four faculty  saturate

Factor  II. The  pattern/structure matrix  has  simple structure. Would  that

real  results were  always  so

  def in i t ive )

The  third question addressed  is  Which  variables were  the  basis for

delineating

  the

  different  person

  factors?"  To

  address this question, factor

scores  are  computed  for  each  variable  on  both  the person  factors.  These

factor scores  are in ?-score

  form,

 as are all  factor scores except standardized,

noncentered

  factor  scores.

The

  variables with factor scores less

  than

  -1.0

  or

  greater

  than

  +1.0

are  more  than  one  standard deviation  in  distance  from  the  factor score

means

 of

 zero.

 These are the

  items

 of

 greatest

  importance or

 least importance

primari ly to the  persons defining the two factors.  Table 7.4 presents  all the

factor  scores  for Factor  I, the  student  person  factor,  for

 which

  factor scores

were  greater  than |1.0|. Table  7.5  presents  all the  factor scores  for  Factor

I I ,  the

  faculty

  person factor, for  which factor scores were greater  than

  |1.0|.

TABLE 7 .5

Salient Factor

 Scores for

 Factor

 II

 Sorted

 by

 Descending Levels

of Agreement

Factor score

Item  I II  SPSS variable label

3

18

17

12

20

19

 26

-1.35

1.60

-.55

1.48

-1.37

-1.55

-1.51

-1.33

-1.28

1.52

1.66

LP  Space that facilitates quiet study

LP A comfortable  and  inviting location

LP A haven  for  quiet  and solitude

LP A contemplative environment

IA  Interdisc

  library needs being

  addressed

SA Willingness  to help users

90

  EXPLORATORY

 AND

 CONFIRMATORY FACTOR

  ANALYSIS

Page 89: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 89/185

In the present exam ple, lower rankings (e.g., 1, 2) indicated items

ranked most important.

 And all the

  noteworthy pattern/structure  coefficients

in

  Table

  7.3 are

  positive.  This  means  that  factor scores

  in

  Tables

  7.4 and

7.5  that

  are

  smallest (i .e. , biggest nega tive value s) were items deemed most

important, whereas factor scores

  that

  are

  largest were items deemed least

important.

The

  factor scores

 for

 Factor

  I

  presented

  in Table  7.4

  suggest that

  the

s tudents were most concerned that library staff are service oriented  and  tha t

the

  library

  is a  comfortable place  for  reflection.  At the  other extreme,

the

  students were least concerned  about

  the

  library being quiet

  and

  about

library  collections.

The

  factor scores

  fo r

 Factor

  I I

 presented

  in

 Table

  7.5

  suggest

 that  th e

faculty

  were most concerned about

  the

  library being quiet

  and

  least con-

cerned about

  the

  helpfulness

  of

  library

 staff  and the

  interdisciplinary needs

of users being ad dressed.

 These

 hypothe t ica l

 faculty

 only care about w hether

the library coverage in their own d isciplines is ad equate

Variables with large factor scores and   different  signs across

  factors

reflect

 w hat v iews

 part icularly differentiate

  the two person

 factors.

  For exam-

ple,

  s tudents want l ibrary

  staff  to be

  helpful

  (F[ =  -1.37),

  whereas this

  is

less

  important

  to

  faculty

  (Fu =

  +1.66). However,

  both

  s tudents

  and  faculty

w an t

  th e

  library

 to be

 com fortable

 (Fj =

 -1.35

  and  FH = — 1.51),

 even  though

they disagree wh ethe r this in volv es quie t

  (Fj =

  +1.60

  and FU = -1.33).

MAJOR

  CONCEPTS

In

  many research situations, factoring variables

  via

  R-technique

  ad-

dresses im port ant issues, such

  as

 bet ter unders tand ing

  th e

  nature

  of

  intelli-

gence  or

  self-concept.

  But  both

  people

  and

  t ime

  can

  also

  be

  factored.

Factoring people

  is

  often

  of

  interest

  in

  psychology, because psychological

theory often is abou t type s of people (e.g., Type A perso nalitie s).

Q-technique  factor analysis is the second-most comm only used EFA

method ,  after  R-technique.

  In

  Q-technique, d eciding

  how

  many people

factors

  to

  extract addresses

  the

  question "How many types

  of

  people

  are

there?" across

  th e

  variables used

  in a

  given analy sis. Persons' pa ttern

  or

s t ructure

  coefficients large

  in

  m a g n i t u d e

  (plus  or

  min u s )

 on a

 given facto r

can be

  examined

  to

  de te rmine

  Which

  factored people belong

  to a

  given

type?"  And the

  factor scores  that

  are

  large

  in

  magnitude

  can be

  examined

to

  de te rmine

  Which

  variables were

  the

  basis

 for

  del ineat ing

  the  different

person

  factors?"

SIX

  TWO-MODE TECHNIQUES

  91

Page 90: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 90/185

8

CONFIRMATORY ROTATION   A N D

FACTOR  INTER PRETA TION ISSUES

There  are

  var ious s i tua t ions

  in

  which

  the

  researcher

  m ay

  w a n t

  to

evaluate

  how

  we l l

  tw o

  di f feren t

  factor structures

  f it

  each other.

  J u s t

  as

unrelated factors  c a n n o t  be readi ly in terpre ted, as  reflected  in the Table

3.6

  results (see chap.

  3),

  factors across samples often  can n o t

  be

 accu ra te ly

compared unless the two structures are compared   analytically  using Procrus-

tean rota t ion methods.

These  m eth o ds  rotate  one  st ruc ture  to a  best-fit posi t ion w ith  a

second target structure.

  If the two

  factor matrices

 do n o t

  match under these

condi t ions, they  can n o t

  f it

  u n d e r

  an y

  other condit ions,

  and the

 s t ruc tures

s imply

  are

  incompat ib le .

  Some

  example applications

  of

  these methods

  in -

c l u d e

  empir ica l ly evaluat ing

1.  w h e th e r  the factor pattern /struc ture matrices for a given set of

variables across di f feren t  samples

 are

  invariant across samples;

2 .

  whether there

  are

  substantively based differences

  in

 s t ruc ture

fo r  a set of

  i tems across groups (e.g. , graduate students

  vs.

facul ty) ;

3.  whether  the  st ruc ture is replicable using  in ternal replicabil i ty

( T h o m p s o n ,  1994, 1996) cross-validation methods (i .e . ,

r andomly sp l i t t i ng

  the

  sample

  to

  in form judgment s abou t

replicabil i ty);  and

93

Page 91: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 91/185

4. the factor adequacy of an obtained pattern/structure matrix by evaluating its fit to a theoretically expected target structure consisting of ones (or negative ones) and zeroes (Thompson & Pitts, 1982).

Thompson (1992b) provided a partial statistical significance test distribution for evaluating these factor correlations across samples.

BEST-FIT ROTATION EXAMPLE

For heuristic purposes, the invariance of the principal components varimax-rotated pattern/structure coefficients involving perceptions of library service quality with regard to 11 variables (PER1 through PER11) will be evaluated. The comparison involves the structure for 100 faculty reported in Table 5.5 (see chap. 5) being compared with the corresponding structure of 100 graduate students (see Appendix A). The graduate students' structure will be used as the target matrix for this analysis.

Veldman (1967, pp. 238-244) explained the mathematical theory underlying the method. He noted that the cosines among the factors across the two solutions equal the factor correlations across samples. He also noted that this matrix of factor cosines can be used to rotate a second matrix to its best-fit position with a target matrix.

The cosine matrix is computed using the matrix algebra formula:

C_{J×I} = (A′_{I×K} B_{K×J})′_{J×I} V_{I×I} E_{I×I}^{-1/2} V′_{I×I},

where I = the number of factors in the A target pattern matrix; J = the number of factors in the B pattern matrix; and K is the number of variables in the two analyses. This algorithm requires first the computation of

Q_{I×I} = A′_{I×K} B_{K×J} (A′_{I×K} B_{K×J})′_{J×I}.

The matrix V_{I×I} is the matrix of pattern coefficients extracted from Q_{I×I}. The matrix E_{I×I} has zeroes off the diagonal and, on the diagonal, the eigenvalues of Q_{I×I}.

This analysis can be conducted in SPSS using the syntax presented in Appendix B. By the way, this syntax invokes the MATRIX procedures available in SPSS, if one learns how to type syntax rather than executing SPSS via point and click. In Appendix B, all the syntax between the MATRIX command and the END MATRIX command executes matrix algebra. The adventurous reader can access an SPSS manual to discover how to command SPSS to execute any matrix procedure (e.g., transpose, inversion, multiplication) or matrix formula.
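For readers who prefer to experiment outside SPSS, the same algebra can be sketched in a few lines of Python with NumPy. This is only an illustrative sketch of the Veldman-style computation described above, not the Appendix B syntax; the matrices A and B stand for the target and comparison rotated pattern/structure matrices (variables in rows, factors in columns).

    import numpy as np

    def best_fit_rotation(A, B):
        # Procrustean best-fit rotation of B toward the target A.
        # Returns the factor cosine matrix C and B rotated to best-fit position.
        S = A.T @ B                      # A'B: cross-products of target and comparison factors
        Q = S @ S.T                      # Q = (A'B)(A'B)'
        evals, V = np.linalg.eigh(Q)     # eigenvalues and eigenvectors of Q
        E_inv_sqrt = np.diag(1.0 / np.sqrt(evals))   # assumes Q is of full rank
        C = S.T @ V @ E_inv_sqrt @ V.T   # factor cosines across the two solutions
        return C, B @ C                  # B @ C is the comparison matrix after best-fit rotation

Applied to the student (target) and faculty matrices used here, a routine of this kind would yield cosine and rotated matrices of the sort reported in Tables 8.1 and 8.2.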


TABLE 8.1
Factor Correlations (Cosines)

                          Student target
Faculty matrix    FACT_I    FACT_II    FACT_III
FACT_I             -.007      1.000       .022
FACT_II            1.000       .006       .024
FACT_III           -.024      -.022       .999

Table 8.1 presents the factor correlations across the two samples. The results indicate that the factors were highly congruent or invariant across samples (e.g., r = 1.00), although the order of the factors was different in the two samples (e.g., Factor I in the student sample was Factor II in the faculty sample).

Table 8.2 presents the factor pattern/structure matrix of the 100 faculty after rotation to best fit with the structure of the 100 graduate students. The structures are very similar across the two groups in this data set.

The program also produces cosines (i.e., here correlations) between the variables in the target pattern/structure matrix and the variables in the best-fit rotated pattern/structure coefficient matrix. These cosines among variables should all be reasonably large to indicate that the two solutions fit reasonably well at the level of each of the variables, and not just at the factor level. If some variables' cosines are smaller, these are the variables that could not be related well in the best-fit rotation.

For the Table 8.2 data, the lowest of the 11 variables' cosines was .94 for variable PER9, and the 10 remaining cosines were all .98 or larger. These results confirm factor fit even across the 11 measured variables.

TABLE 8.2
Pattern/Structure Matrix for 100 Faculty Rotated to Best-Fit Position With the Pattern/Structure Matrix for 100 Graduate Students

                      Factor
Variable    FACT_I    FACT_II    FACT_III
PER1          .890       .204        .213
PER2          .868       .109        .239
PER3          .726       .231        .185
PER4          .866       .164        .266
PER5          .115       .904        .240
PER6          .124       .892        .239
PER7          .167       .905        .223
PER8          .315       .792        .170
PER9          .087       .323        .677
PER10         .209       .166        .801
PER11         .321       .105        .730


FACTOR INTERPRETATION CONSIDERATIONS

These results afford an opportunity to state some general principles of result interpretation. The statement of these principles requires specifying which aspects of results are and are not relevant to the interpretation process.

Factor Order

Both before and after rotation, factors are ordered by the amount of information they contain. Prior to rotation, the eigenvalues characterize the amount of information contained in the unrotated factors. For example, if the first eigenvalue associated with the first principal component was 3.0, and 11 variables were included in the R-technique analysis, the result indicates that 27.2% (3.0 / 11) of the variance in the variables can be reproduced by the first component.

Factor rotation redistributes the variance across the factors in service of pursuing simple structure. The eigenvalues are not relevant to evaluating the information contained in particular rotated factors. Computer packages print sums of squares to address the issue of the information contained within specific rotated factors.

In general, the order of the factors is not of interest in evaluating the solution. For example, here the analyst expected three factors in both the student and the faculty samples: Service Affect, Information Access, and Library as Place. These dimensions did emerge in both samples. No particular factor ordering was theoretically expected in this analysis. It does not matter that the factors emerged in a different sequence in these analyses.

In some analyses there might be an expectation that one factor will dominate the factor space (i.e., contain a hugely disproportionate amount of variance). Only in such cases would a theoretical expectation regarding factor ordering be relevant.

Reflection of Factors

Some factors involve pattern and structure coefficients with different signs. Because most people find it easier to think in positive terms, if the larger pattern and structure coefficients have negative signs on a given factor, it is completely appropriate for the analyst to reflect the factor by reversing the signs of the coefficients on any given factor (Gorsuch, 1983, p. 181).

This is appropriate because psychological variables typically have no intrinsic scaling direction. For example, an achievement test could be scored by counting either the number of right answers or the number of wrong answers.


However, when a given factor is reflected, all the signs of all the pattern and the structure coefficients on the factor must be reversed. It is not necessary to acknowledge the reflection process in describing results. And in a given solution, any combination of different factors may be reflected, including, of course, none.
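Mechanically, reflecting a factor is simply a sign change applied to an entire column of the factor matrix. The following fragment is a minimal sketch, assuming the rotated pattern/structure matrix is held in a NumPy array with factors in columns:

    import numpy as np

    def reflect_factor(matrix, factor_index):
        # Reverse the signs of every coefficient on one factor (one column).
        reflected = matrix.copy()
        reflected[:, factor_index] = -reflected[:, factor_index]
        return reflected

Because the operation touches every coefficient on the chosen factor, the reflected solution conveys exactly the same information as the original one.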

Salience of Variables

It is common to see researchers invoking some criterion to determine what variables are salient to the interpretation of a given factor. Values of pattern/structure coefficients of |.3|, |.35|, and |.4| are used with some frequency. Such values may be useful in setting the midpoint of gaps between smaller and larger coefficients on a given factor.

Researchers may underline or italicize the entries in pattern/structure matrices that meet their criteria, to help readers see patterns. Or researchers may sort the rows of the matrices by the values of the biggest absolute coefficients on different factors.

Some researchers blank out nonsalient coefficients to make simple structure more apparent. Unlike italicizing coefficients or sorting factor rows, omitting nonsalient coefficients is not good practice. Doing so deprives other researchers of the opportunity to conduct secondary analyses, such as invoking different rotation methods, or conducting best-fit rotation of their results with yours.

It is important not to get too rigid in determining salience criteria. Use a salience criterion in a given study that does a good job for the given results in highlighting simple structure.
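Sorting and flagging of this kind is easy to automate. The sketch below, which assumes a NumPy coefficient matrix and a purely illustrative cutoff of |.4|, sorts the rows by each variable's largest absolute coefficient and marks the entries that meet the criterion; the cutoff is an assumption for the example, not a recommendation.

    import numpy as np

    def sort_and_flag(matrix, labels, cutoff=0.4):
        # Order variables by the factor on which they load most strongly,
        # then by the size of that loading, and flag salient coefficients.
        top_factor = np.argmax(np.abs(matrix), axis=1)
        top_value = np.max(np.abs(matrix), axis=1)
        order = np.lexsort((-top_value, top_factor))
        for row in order:
            flags = ["*" if abs(c) >= cutoff else " " for c in matrix[row]]
            cells = [f"{c:6.3f}{f}" for c, f in zip(matrix[row], flags)]
            print(f"{labels[row]:8s}" + " ".join(cells))

Nothing is blanked out: every coefficient is still reported, and the flags merely help readers see the pattern.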

Naming Factors

An important part of the interpretation process is naming the factors. This is where the art of the factor analytic process is exercised. Factor names are usually one- or two-word phrases.

Factors should never be named after measured variables. A factor is inherently an aggregation of several measured variables. A factor reflected in only a single measured variable (i.e., a specific factor) is inherently not a true factor. Thus, all real factors involve multiple variables, and to this extent must be named in a manner reflecting the overall pattern of contribution of different variables to the factor's definition (or conversely reflecting the underlying construct's manifestation via the measured variables).

Factor naming also must be informed by the differential saturation of a given factor by given factored entities. For example, in an orthogonal R-technique analysis, if one variable has a pattern/structure coefficient of +.7 on a factor, and another variable has a pattern/structure coefficient of +.5,


do not forget that the first variable has 49% of its variance in common with the factor, whereas the second measured variable has only 25% common variance. Thus, the first variable should have roughly double influence on selection of the factor label.

Factor names must be sufficiently broad to capture all of the primary relationships suggested by the pattern and structure coefficients. But there is another caveat that must be mentioned. When conducting factor analyses and comparing results with those in prior related studies, it is important to not place undue reliance on construct names assigned by previous researchers. Researchers may have named their factors in ways that do not seem insightful or even reasonable. If the factor labels from prior studies are accepted without scrutiny, invalid comparisons with related prior findings may result.

MAJOR CONCEPTS

In various situations researchers may want to determine how well two different factor structures for a given set of factored entities (e.g., variables if R-technique is used) fit each other. Empirical methods usually must be used for this purpose, because apparent differences may be method artifacts, and empirical methods provide a quantification of degree of structure correspondence. The Appendix B SPSS syntax can be used as a model to conduct such best-fit rotations.

In evaluating correspondence of factors across samples or subsamples, factor order is usually not a consideration. Similarly, a given factor across samples may be oriented in opposite directions, such that the coefficients for a factor in one sample must be reflected by changing all the signs of all coefficients on the factor. Factor orientations are completely arbitrary, but we usually prefer the predominance of large-magnitude coefficients to be positive, because most people find it easier to think positively.

When presenting factor structures, researchers often highlight coefficients that they deem salient. Such salience criteria are usually set by looking for gaps between near-zero and nonzero coefficients (e.g., absolute .4) in a given solution. Salience criteria should not be used with excessive rigidity.

Factors are synthetic variables reflecting underlying latent constructs, and therefore should never be named after measured variables. Equally important, when reading prior research, do not accept the factor labels of others at face value. When conducting either subjective or empirical factor meta-analyses (Thompson, 1989), focus on the invariance of the factor meanings across studies, notwithstanding the labels assigned by different researchers.


9

INTERNAL REPLICABILITY ANALYSES

In quantitative research, all statistical methods, including factor analysis, are susceptible to the influences of three kinds of errors: sampling error, measurement error, and model specification error. As noted in chapter 4, sampling error is that portion of variance in sample data that is unique to a given sample, and in effect reflects the personality of a given sample. By definition, the sampling error in a given sample does not exist in the population.

Measurement error is the variance in data that is due to inherent imperfections in measurement (Thompson, 2003). We estimate these influences by computing ratios of reliable variance to the total observed score variance; these ratios are called reliability coefficients. In every study it is important to estimate the reliability of the data in hand in the study and not merely cite reliability coefficients from the manual or prior studies (Vacha-Haase, Kogan, & Thompson, 2000). As Wilkinson and the Task Force on Statistical Inference (1999) emphasized:

     It is important to remember that a test is not reliable or unreliable. . . . Thus, authors should provide reliability coefficients of the scores for the data being analyzed even when the focus of their research is not psychometric. (p. 596)

Model specification error is the failure to use all and only the correct variables, or the failure to use the correct statistical analysis, or both, thus obtaining


incorrect results. Of course, as Pedhazur (1982) has noted, "The rub, however, is that the true model is seldom, if ever, known" (p. 229). And as Duncan (1975) has noted, "Indeed it would require no elaborate sophistry to show that we will never have the 'right' model in any absolute sense" (p. 101).

Sampling error is particularly pernicious in multivariate analyses, because more complex analyses afford more opportunities for these influences to be manifested. With respect to exploratory factor analysis (EFA), Gorsuch (1983) noted, "In factor analysis, one has numerous possibilities for capitalizing on chance. Most extraction procedures, including principal factor solutions, reach their criterion by such capitalization. The same is true of rotational procedures, including those that rotate for simple structure" (p. 330).

There are two major ways of empirically addressing result replicability questions: external replicability analyses and internal replicability analyses (Huberty, 1994; Thompson, 1996). External replicability studies involve collection of independent data from different participants. These are true replication studies and provide the best evidence of result replicability by directly and explicitly addressing replicability concerns.

However, many researchers find themselves unable to always conduct external replicability studies. In these situations researchers may use some combination of three major logics for internal replicability analyses: cross-validation, the jackknife, and the bootstrap (Thompson, 1993b, 1994). These three methods combine the subjects in hand in different ways to determine whether results are stable across sample variations, that is, across "the idiosyncracies of individuals that make generalization in social science so challenging" (Thompson, 1996, p. 29).

Here both EFA cross-validation and bootstrap methods will be illustrated. These are, respectively, the simplest and the most complex internal replicability methods. Jackknife applications have been discussed elsewhere (Huberty, 1994).

Across the three major internal replicability analysis choices it is important to remember that none of the methods is as useful as true replication studies. As Thompson (1993b) noted, because

     all three strategies are typically based on a single sample of subjects in which the subjects usually have much in common (e.g., point in time of measurement, geographic origin) relative to what they would have in common with a separate sample, the three methods all yield somewhat inflated estimates of replicability. Because inflated estimates of replicability provide a better estimate of replicability than no estimate at all (i.e., statistical significance testing), these procedures can still be useful in focusing on the sine qua non of science. (pp. 368-369)


Cross-validation and bootstrap applications will be illustrated here using for heuristic purposes the first 30 cases of data and variables PER1 through PER8 from Appendix A.

EFA CROSS-VALIDATION

Cross-validation methods executed as internal replicability analyses randomly split the data into two subgroups. Analyses are conducted separately within each subgroup, and then an empirical analysis must be executed to quantify the degree of fit across the two analyses. These methods have been used across the general linear model, from regression to canonical correlation analyses (cf. Grossman, 1996; Thompson, 1994). The methods are logically straightforward and do not require specialized software.
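In outline, the procedure can be scripted in a few steps. The sketch below is not the book's own procedure; it assumes a NumPy data matrix (cases in rows), a user-supplied extract_rotate function that returns a rotated pattern/structure matrix for whatever extraction and rotation are being used (here, principal components with varimax rotation), and the best_fit_rotation routine sketched in chapter 8.

    import numpy as np

    def split_half_check(data, n_factors, extract_rotate, best_fit_rotation, seed=1):
        # Randomly assign cases to two subgroups ("coin flips"), factor each
        # subgroup separately, and align the second solution to the first.
        rng = np.random.default_rng(seed)
        flip = rng.integers(0, 2, size=data.shape[0])
        group1, group2 = data[flip == 0], data[flip == 1]

        target = extract_rotate(group1, n_factors)
        other = extract_rotate(group2, n_factors)
        cosines, other_aligned = best_fit_rotation(target, other)
        return cosines, target, other_aligned

The returned cosine matrix quantifies the degree of fit across the two subgroup solutions, which is the empirical step the text refers to.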

Table 9.1 presents the cross-validation subgroup assignments (1 or 2) for the 30 cases produced by flipping a coin. For logistical purposes, as is typically done because keeping track of the subgroups is thereby facilitated, here the 30 cases were split 16/14 rather than 15/15. Table 9.2 presents the pattern/structure matrices from principal components analyses rotated to the varimax criterion for both subgroups.

The best-fit rotation procedure must then be invoked to align the factor structures to optimal fit. Either solution can be used as the target for this analysis, although some preference might be afforded to using the larger sample as the target. Table 9.2 also presents the second solution rotated to best-fit position with the solution for the first subgroup. In this small heuristic example the solutions in the two subgroups are reasonably congruent. This is reflected in the cosines of the two solutions virtually being an identity matrix.

The biggest disparity across the subgroups involves the variable PER8, which in the second subgroup had pattern/structure coefficients on Factor I of .667 and on Factor II of .596. In the first subgroup, PER8 had a trivial pattern/structure coefficient on Factor I (.158² = 2.5%) but a huge value on Factor II (.840² = 70.6%). This disparity is also reflected in the cosines for the eight variable pairs each being greater than .94, except for PER8, which had a cosine of .808.

Cross-validation can give some insight regarding potential result replicability. If solutions will not replicate across subgroups in an internal replicability analysis involving a single sample, then replication with a truly independent second sample seems particularly unlikely.

And it should be emphasized that

     the full sample results are always the basis for our interpretations. The subsample results are only employed as a vehicle for exploring the replicability of the full sample results. The full sample results are used for interpretation, because we tend to vest more confidence in results as sample size increases, and the full sample analysis involves more subjects than do the subgroup analyses. (Thompson, 1994, p. 172)


TABLE 9.1
Cross-Validation Random Subgroup Assignments and Bootstrap Resampling Assignments
(Columns: Case 1-30; cross-validation Subgroup assignment; bootstrap Draw number; and the case drawn at each draw in Resamples 1 through 4.)

The subgroups are only used as a vehicle to evaluate the possible replicability of the factor solution for the complete sample.

EFA BOOTSTRAP

The bootstrap is an elegant computer-intensive internal replicability analysis conceptualized for several analyses by Efron and his colleagues. Diaconis and Efron (1983) provide an accessible article summarizing this logic in layperson's terms. Lunneborg's (2000) book provides a more technical and comprehensive view.


TABLE 9.2
Varimax-Rotated Factor Pattern/Structure Matrix for Two Subgroups (n1 = 16; n2 = 14)

            Subgroup 1 (target)     Subgroup 2          Subgroup 2 after best-fit
Variable        I        II          I        II            I        II
PER1          .896     .340        .929     .214          .923     .238
PER2          .954     .110        .942     .250          .935     .274
PER3          .784     .474        .809     .496          .796     .576
PER4          .877     .354        .596     .487          .583     .502
PER5          .376     .792        .345     .898          .322     .906
PER6          .326     .845        .356     .890          .333     .899
PER7          .261     .906        .228     .926          .205     .932
PER8          .158     .840        .667     .596          .657     .613

Note. Pattern/structure coefficients greater than |.5| are presented in italics.

The bootstrap can be conceptualized as a method of using the sample data in such a manner that the sample data are themselves used to estimate the distribution of a large number of sample results from the population (i.e., the sampling distribution). This process of the estimation pulling itself up by its own bootstraps explains why the approach is named the bootstrap.

One way of explaining the technique, although somewhat simplified, begins by describing the creation of a mega-file created by copying the sample data set of size n onto itself (i.e., concatenating) a very large number of times. Then repeated resamples of cases, each of exactly size n, are drawn from the mega-file, and in each resample the results are analyzed independently.

The number of resamples is usually at least 1,000, although 2,000 to 5,000 resamples are recommended. In a given resample, a given case might be drawn not at all, once, or several times. And this case might be drawn not at all, once, or several times in the next resample. After all the resamples are drawn, mean (or median) parameter estimates (e.g., R², factor pattern/structure coefficients) are computed across the resamples.

The standard deviations of the estimates are also computed. These are empirical (as against theoretically based) estimates of the standard errors of each of the parameter estimates.

There is only one difficulty in applying the bootstrap in multivariate statistics such as EFA: When more than one set of weights (e.g., factors) can be computed, the weights across the resamples must be rotated to best fit with a common target matrix prior to computing averages and standard errors. This is because in the first resample a construct might emerge as Factor I, but in the next resample might emerge as Factor II. Factors must be rotated to best-fit position so that averages of apples and apples are being computed across the resamples.


TABLE 9.3
First Four Eigenvalues From Four Resamples (n = 30)

                       Factor
Resample          I         II        III
1               3.867     2.523     0.600
2               4.798     1.776     0.675
3               5.357     1.312     0.548
4               5.190     1.553     0.605
Mean            4.803     1.791     0.607
SE/SD           0.577     0.454     0.045

Of course, these mechanics are not difficult when they are automated, and bootstrap applications across various multivariate analyses have been reported and recommended (Thompson, 1988, 1992a, 1995). In EFA the bootstrap can be used to inform the decision of how many factors to extract, to help evaluate the replicability of the structure, or both.
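As an illustration of how such automation might look, the sketch below strings together the pieces already introduced: it assumes the same user-supplied extract_rotate helper and the best_fit_rotation routine from chapter 8, draws resamples of n cases with replacement, aligns each resampled solution to the full-sample structure, and then averages. It is a simplified sketch, not the specialized software the book refers to.

    import numpy as np

    def bootstrap_efa(data, n_factors, extract_rotate, best_fit_rotation,
                      n_resamples=1000, seed=0):
        rng = np.random.default_rng(seed)
        n = data.shape[0]
        target = extract_rotate(data, n_factors)      # full-sample structure as the common target

        aligned = []
        for _ in range(n_resamples):
            draw = rng.integers(0, n, size=n)         # n cases drawn with replacement
            solution = extract_rotate(data[draw], n_factors)
            _, solution_aligned = best_fit_rotation(target, solution)
            aligned.append(solution_aligned)

        aligned = np.stack(aligned)                   # (resamples, variables, factors)
        means = aligned.mean(axis=0)                  # bootstrapped coefficient estimates
        ses = aligned.std(axis=0, ddof=1)             # empirically estimated standard errors
        return means, ses, means / ses                # the last term is the critical ratio ("t")

Eigenvalues from each resample could be accumulated in the same loop to support the extraction decision described next.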

Factor Extraction Bootstrap Procedures

The bootstrap can be used as another procedure to decide how many factors to extract, in addition to the methods described in chapter 3. In this heuristic example only four resamples are drawn. This makes the summary manageable, but of course is not a realistic description of an actual bootstrap analysis.

As reported in Table 9.1, in the first resample, the first case drawn was Case #1. In each draw all the data for a given case are extracted into the resample. Case #1 was also drawn again in Draw 21.

In each resample the eigenvalues are computed. Then means of the eigenvalues and the standard deviations can be derived. Because the resamples are being used to estimate sampling distributions, these standard deviations are actually empirically estimated standard errors (i.e., SDs of the parameter estimates in the sampling distribution).

Table 9.3 presents the first three of the eight eigenvalues from principal components analyses of the data from the four resamples. The table also presents the mean eigenvalues across resamples and the standard errors of these estimates. For example, the mean eigenvalue across four resamples for the second factor without rotation was 1.791 (SE = 0.454). The mean eigenvalue across four resamples for the third factor was 0.607 (SE = 0.045).

For these results the extraction of two factors is suggested. If the third eigenvalue mean had been 0.999, and the standard error had been small


but nontrivial, the extraction of the third factor might have been warranted. In other cases, even if an eigenvalue mean is slightly greater than 1.0, a relatively large standard error would suggest that not much faith can be vested in this eigenvalue consistently being greater than 1.0; here some thought might be given to not extracting the factor associated with this relatively small and unstable eigenvalue.

Pattern/Structure Coefficients and Standard Errors

Bootstrap estimation of factor pattern/structure coefficients and their standard errors requires the specification of a common factor space or target to which all resampled solutions can be rotated to a position of best fit. The structure in the actual sample can be used as the target for this purpose.

Another choice is a theoretically expected factor structure. In the present example, the target matrix reflected an expectation that variables PER1 through PER4 would have +1.0 coefficients on Factor I, and zeroes on Factor II. The target matrix also reflected an expectation that variables PER5 through PER8 would have +1.0 coefficients on Factor II, and zeroes on Factor I.

Table 9.4 presents the varimax-rotated pattern/structure matrices from principal components analyses in the four resamples. Note that the first factor includes variables PER1 through PER4 in Resamples #2 and #3 but not #1 and #4.

Table 9.5 presents the four structures each rotated to best-fit position with the target matrix. Table 9.6 summarizes the means of the parameter estimates and their standard errors. Table 9.6 also reports the mean parameter estimates divided by their respective standard errors.

The ratio of a parameter estimate to its standard error (SE) is very important in statistics, and so this ratio is given many names. These names include critical ratio, Wald statistic, and t. We usually expect these t values to be greater than roughly |2.0| for parameters that we wish to interpret as being nonzero.

Overall, the heuristic results, if they had been real, would have supported a conclusion that the factor structure was reasonably replicable. This is reflected in the fact that the empirically estimated standard errors for large values in Table 9.6 are quite small (e.g., .039, .009). These standard errors indicate that the noteworthy parameters in the model do not vary much across various configurations of the people considered in the bootstrap resamples. If we mix up our sample cases in many different configurations and always still get the same basic factor structure, this would give some hope that the structure may also replicate across the configuration of participants in a new sample.


TABLE 9.4
Varimax-Rotated Pattern/Structure Matrix From Four Resamples (n = 30)

                  Factor
Variable        I         II
Resample 1
PER1         -.066       .921
PER2         -.067       .958
PER3          .347       .836
PER4          .251       .808
PER5          .847       .219
PER6          .904       .031
PER7          .917      -.021
PER8          .797       .135
Resample 2
PER1          .966       .136
PER2          .964       .130
PER3          .927       .266
PER4          .759       .431
PER5          .128       .927
PER6          .344       .784
PER7          .182       .845
PER8          .159       .775
Resample 3
PER1          .869       .297
PER2          .947       .127
PER3          .810       .493
PER4          .859       .292
PER5          .555       .670
PER6          .226       .899
PER7          .251       .882
PER8          .228       .825
Resample 4
PER1          .067       .947
PER2          .215       .933
PER3          .485       .836
PER4          .541       .632
PER5          .922       .203
PER6          .911       .228
PER7          .878       .148
PER8          .788       .329

MAJOR CONCEPTS

All statistical methods, including factor analysis, are susceptible to the influences of three kinds of errors: sampling error, measurement error, and model specification error. To address the influences of sampling error, researchers replicate results with independent samples (i.e., external replication).


TABLE 9.5
Four Resample Solutions Rotated to Best Fit With Target

                  Factor
Variable        I         II
Resample 1
PER1          .920      -.079
PER2          .957      -.080
PER3          .841       .335
PER4          .812       .240
PER5          .231       .844
PER6          .044       .904
PER7         -.008       .917
PER8          .146       .795
Resample 2
PER1          .968       .125
PER2          .966       .119
PER3          .930       .255
PER4          .764       .422
PER5          .139       .926
PER6          .354       .779
PER7          .192       .843
PER8          .168       .773
Resample 3
PER1          .864       .312
PER2          .944       .143
PER3          .802       .507
PER4          .854       .306
PER5          .544       .679
PER6          .211       .903
PER7          .236       .886
PER8          .214       .828
Resample 4
PER1          .950       .012
PER2          .944       .159
PER3          .863       .435
PER4          .663       .503
PER5          .256       .909
PER6          .282       .896
PER7          .199       .868
PER8          .375       .767

When external replication is impossible or impractical, internal replicability methods, such as cross-validation or the bootstrap, may be useful.

Exploratory factor analysis cross-validation methods can be conducted by randomly splitting the sample and using SPSS to factor analyze both data sets, and then using the Appendix B best-fit rotation to quantify degree of factor fit across the two subsamples. However, the basis for actual factor interpretation is always the structure from the total sample. The subsamples are created only to gain insight regarding the character of the total data set and not for factor interpretation.


TABLE 9.6
Bootstrapped Estimates of Pattern/Structure Coefficients and Their Standard Errors

                      Factor I                    Factor II
Variable      Mean    SD/SE      t         Mean    SD/SE      t
PER1          .926     .039    23.47       .093     .146     0.63
PER2          .953     .009   102.33       .085     .096     0.88
PER3          .859     .046    18.49       .383     .096     3.99
PER4          .773     .071    10.86       .368     .102     3.62
PER5          .293     .152     1.93       .840     .098     8.60
PER6          .223     .115     1.94       .871     .053    16.45
PER7          .155     .095     1.62       .879     .027    32.58
PER8          .226     .090     2.52       .791     .024    33.09

Note. Pattern/structure coefficients greater than |.4| are italicized.

The EFA bootstrap (Thompson, 1988) is a particularly powerful internal replicability method. The bootstrap is so powerful because it resamples the data in so many different combinations. However, as a practical matter, specialized software must be used to conduct these analyses.


10

CONFIRMATORY FACTOR ANALYSIS DECISION SEQUENCE

Although the basic concepts of factor analysis are roughly a century old (see Spearman, 1904), more recent extensions of the exploratory factor analysis (EFA) concepts have created the basic methods for confirmatory factor analysis (CFA; see Joreskog, 1969). Both EFA and CFA are part of the general linear model (GLM). Thus, many of the concepts introduced previously with regard to EFA apply in CFA as well. Indeed, an important emphasis in this book is elaborating the similarities between EFA and CFA as part of a single underlying GLM, while also highlighting the differences in the two sets of methods.

Of course, the basic difference in CFA is that one or more underlying models (i.e., how many factors, which measured variables are thought to reflect which latent variables, whether the factors are correlated) must be specified even to run the analysis. There are also statistical analyses that are possible in CFA but impossible in EFA (e.g., allowing error variances to be correlated).

And some procedures that are routine in EFA, such as factor rotation, are irrelevant in CFA. This is because the a priori models themselves typically specify simple structure, by constraining certain factor pattern coefficients to be zero while freeing other pattern coefficients to be estimated.

Confirmatory factor analysis is a very important component within a broader class of methods called structural equation modeling (SEM), or sometimes covariance structure analysis.


CFA specifies the "measurement models" delineating how measured variables reflect certain latent variables. Once these measurement models are deemed satisfactory, then the researcher can explore path models (called structural models) that link the latent variables.

Structural equation modeling is beyond the scope of the present volume. Excellent book-length treatments of SEM are available elsewhere (see Byrne, 1994, 1998, 2001). Chapter-length expositions are also available (see Thompson, 2000c). Schumacker and Marcoulides (1998) and Marcoulides and Schumacker (2001) describe recent extensions of these methods.

Suffice it to say that even when the researcher is applying the wider range of analyses possible with SEM, CFA models are often a critical starting place for such analyses (Thompson, 2000c). It makes little sense to relate constructs within an SEM model if the factors specified as part of the model are not worthy of further attention.

CONFIRMATORY FACTOR ANALYSIS VERSUS EXPLORATORY FACTOR ANALYSIS

In EFA, all parameters implicit in a factor model must be estimated. For example, if there are 9 measured variables, and 3 principal components are extracted and then rotated to the promax criterion, then 27 (9 × 3) pattern coefficients, 27 (9 × 3) structure coefficients, 9 communality coefficients (h², or their corresponding uniquenesses, estimated as u² = 1 - h²), and 3 factor correlation coefficients (r_I×II, r_I×III, r_II×III) are estimated. Of course, because P_{9×3} R_{3×3} = S_{9×3}, given P, if either the factor correlation matrix (R_{3×3}) or the factor structure matrix (S_{9×3}) is mathematically determined, so is either R or S, and thus in actuality only 30 parameters (27 + 3) are estimated. Note that the uniquenesses are also mathematically determined once the pattern and structure matrices are estimated.

In CFA, the researcher can "constrain" or "fix" certain parameters to mathematically "permissible" values (e.g., a variance may be constrained to equal any positive number; a correlation, r, may be constrained to equal -1, +1, or any number in between), while at the same time "freeing" the use of the input data to derive estimates of other model parameters (e.g., factor pattern coefficients, factor variances). For example, the researcher might constrain variables V1, V2, and V3 to reflect only Factor I, variables V4, V5, and V6 to reflect only Factor II, and variables V7, V8, and V9 to reflect only Factor III, with the three factors allowed to be correlated.

In EFA, the researcher may expect certain coefficients, but these expectations cannot be incorporated into the analysis. In CFA a researcher must declare as input into the analysis one or more specific models, each containing some "fixed" and some "freed" parameters.

The original computer software for CFA (and SEM), such as LISREL (i.e., analysis of Linear Structural RELationships; Joreskog & Sorbom, 1984), required the researcher to input these model expectations using matrix algebra. A thorough knowledge of both the uppercase and the lowercase Greek alphabet was also required.

Modern software, such as AMOS (Arbuckle & Wothke, 1999) and EQS (Bentler, 1995), instead uses a graphics-oriented user interface. The researcher's expectations are declared using the software to draw a diagram representing the input model. The results are then reported by the software in text form (i.e., sets of coefficients printed for different matrices), but also as an output diagram with estimates printed on the input model diagram.

By tradition, in these various computer programs measured variables are represented by squares or rectangles. Latent variables are represented as circles or ovals. Regression paths or, in the factor analysis case, pattern coefficients, are represented by one-headed arrows. These are drawn from factors to measured variables to reflect the premise that the latent variables are in fact an underlying influence on the manifestation of the factors in the form of scores on the measured variables. Correlations (and covariances) are represented as two-headed arrows drawn to connect either measured or latent variables in pairs for which correlations or covariances are freed to be nonzero and are estimated in the analysis.

Figure 10.1 presents the input graphic model declaration implied by an EFA involving nine measured variables and three correlated factors. Table 10.1 presents a sequential count of the 30 parameters that are estimated as well as a sequential count of the parameters mathematically determined once the estimates are in hand.

Figure 10.2 presents the related CFA input model presuming that each of the nine measured variables is associated with only one factor, three measured variables per factor. The absence of paths between selected pairs of measured and latent variables (e.g., between V1 and Factor II, between V2 and Factor II) implies that these factor pattern coefficients have been constrained to be zero. This model declares simple structure on its face, because no measured variable is allowed to function as an indicator for more than one factor. Thus, if model fit is adequate, no rotation will be necessary.

Table 10.2 presents a sequential count of parameters estimated, not estimated, and implied by this model. However, one difference between EFA and CFA arises in this presentation. Note from Table 10.2 that the nine uniquenesses (u²) are counted as estimated in the CFA analysis. In the EFA analyses these were taken as implied, because once the pattern and structure coefficients are in hand, the communality coefficients (h²) are computed by simple multiplications and additions along a given row of the factor matrix, and then uniquenesses are computed as u² = 1 - h².


Figure 10.1. Exploratory factor analysis factor model with nine measured variables and three correlated factors.

Figure 10.3 presents yet another factor model for this research problem. This figure illustrates some of the model elements that are not possible in EFA. First, the uniquenesses or error variances involving e1 and e2 have been allowed to covary or be correlated in this model. No error variances can be correlated in an EFA model, but in CFA the correlations of various


TABLE 10.1
Counts of Parameters Estimated (and Implied) by the Figure 10.1 Factor Model

                     Pattern                 Structure
Variable        I      II     III        I      II     III       u²
V1              1      10      19      (31)    (40)    (49)     (58)
V2              2      11      20      (32)    (41)    (50)     (59)
V3              3      12      21      (33)    (42)    (51)     (60)
V4              4      13      22      (34)    (43)    (52)     (61)
V5              5      14      23      (35)    (44)    (53)     (62)
V6              6      15      24      (36)    (45)    (54)     (63)
V7              7      16      25      (37)    (46)    (55)     (64)
V8              8      17      26      (38)    (47)    (56)     (65)
V9              9      18      27      (39)    (48)    (57)     (66)
Factor r
I
II             28
III            29      30

Note. Coefficients mathematically determined and thus implied once estimates are in hand are noted in parentheses.

pairs of error variances can be estimated, as deemed necessary by the researcher.

Second, only two factors are allowed to be correlated, and this correlation is estimated as part of the model. In EFA, factors must be either all correlated or all uncorrelated.

Third, an "equality" constraint, designated by the same letter (here "a") being placed on the paths for the factor pattern coefficients for three measured variables on Factor III, has been imposed. This means that the input data will be consulted to estimate these three pattern coefficients, but the estimate has been constrained such that the same single number must be used for all three of these pattern coefficients. Equality constraints are not possible within EFA.

Equality constraints have several possible uses in CFA, but one use is to test a theoretical expectation that all measured variables on a given factor measure that factor equally well. For example, when scoring on test subscales is done by summing item scores on a given subscale, the implication is that the items on the subscale measure the underlying construct equally well. This premise can be directly tested in CFA.

In CFA one may also (and should) test rival models. Equality constraints might be imposed in one model; however, in a rival model all freed pattern coefficients might be estimated with no equality constraints. This rival model might reflect a belief that the test items do reflect the intended


TABLE 10.2
Counts of Parameters Estimated, Not Estimated, (and Implied) by the Figure 10.2 Factor Model

                    Pattern                   Structure
Variable       I     II    III          I      II     III       u²
V1             1     —     —           (22)   (31)   (40)       13
V2             2     —     —           (23)   (32)   (41)       14
V3             3     —     —           (24)   (33)   (42)       15
V4             —     4     —           (25)   (34)   (43)       16
V5             —     5     —           (26)   (35)   (44)       17
V6             —     6     —           (27)   (36)   (45)       18
V7             —     —     7           (28)   (37)   (46)       19
V8             —     —     8           (29)   (38)   (47)       20
V9             —     —     9           (30)   (39)   (48)       21
Factor r
  I
  II          10
  III         11     12

Note. Coefficients mathematically determined and thus implied once estimates are in hand are noted in parentheses. Fixed parameters are designated by dashes (—). Here 18 factor pattern coefficients are constrained to be zeroes. And the three factor variances are fixed to equal 1.0s, thus making the factor covariances equal factor correlation coefficients, because for example r_I×II = COV_I×II / (1.0 x 1.0).

Rival Models

Like SEM, CFA involves testing the fit of models to data. When models fit well, as reflected in various statistical fit indices, some researchers erroneously infer that the correct model has been specified, and that in some sense the model is proven.

In fact, it is conceivable that several models may fit a given data set. As Thompson (2000c) noted, "the fit of a single tested model may always be an artifact of having tested too few models" (p. 278). Thus it is desirable to test the fit of several rival models when conducting a CFA.

The fit of a preferred model is more impressive when that fit occurs in the context of testing several rival models, especially when some of the rival models are theoretically plausible and are not articulated so as to create straw person arguments. The fit of a model is also more impressive when the tested model is strongly disconfirmable.

Disconfirmability is partially a function of the degrees of freedom for the model test. Degrees of freedom indicate, given the presence of one or more sample statistics (e.g., the mean of the sample data, the r between two measured variables), how many pieces of information in the analyses remain free to vary after the estimate is in hand. For example, in the presence of knowledge that the sample mean is 3, given any two scores in the data set (1, 3, 5), the researcher can always accurately determine the third score.


Figure 10.3. Confirmatory factor analysis (CFA) factor model with three model features unique to CFA.

For this problem the degrees of freedom is a function of the sample size, n, and equals n - 1.

Table 10.3 presents the computations for statistical significance testing in any r² or multiple regression (R²) analysis involving p predictor variables (e.g., for the bivariate correlation, p = 1). There are two related situations in which F_CALCULATED (and the corresponding p_CALCULATED also) is undefined and a statistical significance test cannot be conducted.

TABLE 10.3
Statistical Significance Calculations for r² and R² Problems

Source         Sum-of-squares   Degrees of freedom   Mean square   F_CALCULATED   r²/R²
Model                A                  D                 G              J           K
Unexplained          B                  E                 H
Total                C                  F                 I

Note. D = p (i.e., the number of predictor variables). F = n - 1. E = n - 1 - p. G = A / D. H = B / E. I = C / F. J = G / H. K = A / C.
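To make the Table 10.3 bookkeeping concrete, the short Python sketch below (not from Thompson's text; the sums of squares, p, and n are made-up illustrative values) applies the note's definitions to compute the mean squares, F_CALCULATED, p_CALCULATED, and R².

    from scipy.stats import f as f_dist

    # Hypothetical values: sums of squares for a regression with
    # p = 2 predictors and n = 30 cases.
    ss_model, ss_unexplained = 40.0, 60.0
    ss_total = ss_model + ss_unexplained            # C = A + B
    p, n = 2, 30

    df_model = p                                    # D = p
    df_unexplained = n - 1 - p                      # E = n - 1 - p
    df_total = n - 1                                # F = n - 1

    ms_model = ss_model / df_model                  # G = A / D
    ms_unexplained = ss_unexplained / df_unexplained  # H = B / E
    f_calculated = ms_model / ms_unexplained        # J = G / H
    r_squared = ss_model / ss_total                 # K = A / C

    p_calculated = f_dist.sf(f_calculated, df_model, df_unexplained)
    print(f"F({df_model}, {df_unexplained}) = {f_calculated:.2f}, "
          f"p = {p_calculated:.4f}, R2 = {r_squared:.2f}")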

First, regardless of sample size, if the sum-of-squares_UNEXPLAINED is zero (and correspondingly the r² or R² is 100%), the F cannot be calculated, because the mean square_UNEXPLAINED will be zero, and division by zero is mathematically impermissible.

Second, whenever the degrees of freedom_MODEL equals the available degrees of freedom (i.e., for these analyses n - 1), the sum-of-squares_UNEXPLAINED will always be zero. For example, if X and Y are both variables (i.e., SD_X > 0 and SD_Y > 0), and n = 2, r² must be 100%, and the sum-of-squares_UNEXPLAINED must by definition be zero. Similarly, in a regression analysis involving any two predictors, the R² will always be 100% whenever n = 3.

Clearly, an R² of 100% is entirely unimpressive when 10 predictor variables are used and n is 11. However, an R² of 100% would be very impressive (even though the result still could not be tested for statistical significance) if n = 1,000, because this regression model would be highly disconfirmable given that the residual degrees of freedom_UNEXPLAINED equaled 989.

The residual degrees of freedom are an index of the potential disconfirmability of any model. The good fit of models is more impressive when degrees of freedom_UNEXPLAINED are large, because as a function of the residual degrees of freedom the model is parsimonious and yet has good explanatory ability.

In CFA (and SEM), however, total degrees of freedom for tests of model fit are not a function of the sample size, n. Instead, the total degrees of freedom in CFA are a function of the number of the unique entries in the covariance matrix, which is a function of the number of measured variables (v) in the analysis. The total degrees of freedom for these tests equals [v (v + 1)] / 2. For example, in a study involving v = 9 measured variables, the available degrees of freedom equal ([9 (9 + 1)] / 2) = ([9 (10)] / 2) = 45. This corresponds to evaluating 9 variances and 36 covariances (i.e., there are 9 + 36 = 45 unique entries in the associated variance/covariance matrix).
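A quick way to verify such counts is to script them. The following Python sketch (an illustration, not code from the book) computes the available degrees of freedom for v measured variables and the model degrees of freedom left after "spending" one degree of freedom per estimated parameter; it reproduces the v = 9 example here and anticipates the 36 - 17 = 19 example given later in the chapter.

    def available_df(v: int) -> int:
        """Unique entries in a v x v covariance matrix: [v(v + 1)] / 2."""
        return v * (v + 1) // 2

    def model_df(v: int, estimated_parameters: int) -> int:
        """Degrees of freedom remaining for the model fit test."""
        return available_df(v) - estimated_parameters

    print(available_df(9))         # 45 = 9 variances + 36 covariances
    print(available_df(8))         # 36, for an eight-variable model
    print(model_df(8, 8 + 8 + 1))  # 19: 8 pattern coefficients + 8 error
                                   # variances + 1 factor covariance = 17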


In CFA (and SEM), each parameter (e.g., factor pattern coefficient, model error variance, factor correlation) that is estimated "costs" one degree of freedom. This means that each and every model involving nine measured variables and estimation of any 45 parameters will always perfectly reproduce or fit the data.

The disconfirmable and/or plausible models for a CFA are context-specific, and some such models can only be developed by consulting theory and related empirical research. However, several rival CFA models may often be useful across a fairly wide array of problems:

1. Independence model. This model specifies that the measured variables are all perfectly uncorrelated with each other, and thus that no factors are present. The model estimates only the variances of each of the measured variables. This model is disconfirmable but usually not plausible. However, the statistics quantifying the fit of this model are very useful in serving as a baseline for evaluating the fits of plausible rival models.

2. One-factor model. This model is disconfirmable, but usually not plausible for most research situations. Again, however, even if this model is not theoretically plausible, the fit statistics from this model test can be useful in characterizing the degree of superiority of a rival multifactor model. And it is desirable to rule out the fit of this model even when multifactor models are preferred, so that support for the multifactor model is then stronger.

3. Uncorrelated factors model. In EFA, as noted in chapter 3, orthogonal rotations are the default and usually provide simple structure. However, in CFA, correlated factors are usually expected and almost always provide a better fit to the data. However, it is helpful to quantify the degree of superiority of model fit for a correlated versus uncorrelated factors model. Estimating the factor correlations for f factors costs (f [f - 1]) / 2 degrees of freedom, for example, (3 [3 - 1]) / 2 = (3 [2]) / 2 = 3. Uncorrelated factor models are more parsimonious when there are two or more factors. The question is whether the sacrifice of parsimony (and of related disconfirmability) is worthwhile in service of securing a better model fit.

Model Identification

A model is said to be "identified" when, for a given research problem and data set, sufficient constraints are imposed such that there is a single


apples to apples as to how well each measured variable reflects the underlying construct.

The selection of scaling for the latent variables will not affect the model fit statistics. The scaling decisions are basically arbitrary and can reasonably be made as a function of the researcher's stylistic preferences.

These are not the only conditions necessary to achieve model identification. When a single measured variable is posited to reflect multiple underlying factors, evaluating model identification becomes technically difficult. Bollen (1989) and Davis (1993) provided more details.

Fortunately, modern computer software is relatively sophisticated at detecting model underidentification. Most packages (e.g., AMOS) provide diagnostics such as suggestions of how many more parameters need to be "fixed" or constrained, and some guidance as to which parameters in particular are good candidates for achieving model identification.

Matrix of Association Coefficients to Analyze

As noted in chapter 3, in EFA there are numerous choices that can be made regarding the matrix of association coefficients to be analyzed. Popular EFA choices include the Pearson product-moment r matrix, the Spearman's rho matrix, and the variance/covariance matrix. As noted in chapter 3, the default choice in statistical packages is the Pearson r matrix, and this is the selection used in the vast preponderance of published EFA studies.

The statistical software for CFA supports an even larger, dizzying array of choices. These include coefficients with which many readers may be unfamiliar, such as the polyserial or the polychoric correlation coefficients. But the default choice in most CFA software is the covariance matrix.

This matrix has covariances off the main diagonal. Remember that r_XY = COV_XY / [(SD_X)(SD_Y)], and that therefore COV_XY = r_XY (SD_X)(SD_Y). The correlation is a "standardized" covariance from which the two standard deviations have been removed by division.

The variance/covariance matrix has variances on the main diagonal. Because the covariance of a variable with itself equals the variance of the variable, it is more common to hear researchers refer to this matrix more simply as the covariance matrix.

The computation of both the Pearson r and the covariance matrices presumes that the data are intervally scaled. This means that the researcher believes that the underlying measurement tool yielding the scores has equal units throughout. For example, on a ruler every inch (e.g., from 1" to 2", from 11" to 12") is the same amount of distance (or would be if the ruler were perfect). Note that the assumption is that the ruler has equal intervals, and not that the scores of different people are all equidistant.


In psychology sometimes the interval scaling of the measurement is more debatable when scores on abstract constructs are being estimated. For example, if Likert scales are used to measure attitudes, and 1 = never, 2 = often, 3 = very often, and 4 = always, most researchers would doubt that the data were intervally (continuously) scaled. However, if 1 = strongly disagree, 2 = disagree, 3 = agree, and 4 = strongly agree, many (not all) researchers would deem the scores intervally scaled, or at least approximately intervally scaled.

Of course, whenever there is some doubt regarding the scaling of data, or regarding the selection of matrix of associations to analyze, it is thoughtful practice to use several reasonable choices reflecting different premises. When factors are invariant across analytic decisions, the researcher can vest greater confidence in a view that results are not methodological artifacts.

In CFA, for some complex statistical reasons, it is almost always preferable to use the covariance matrix, rather than the Pearson r matrix, as the matrix to be analyzed. There are some conditions in which the Pearson product-moment matrix may be correctly used (see Cudeck, 1989). However, the covariance matrix will always yield correct results, as long as the data are appropriately scaled and distributed.

Multivariate Normality

As noted in chapter 7, the shapes of the data distributions being correlated affect the matrices of association actually analyzed in EFA. Traditionally, considerable attention is paid to these issues in Q-technique EFA applications, while less attention is focused on these concerns in R-technique EFA applications, even when the Pearson product-moment r matrix is the basis for analysis.

However, many CFA results are particularly affected by data distribution shapes. Certain statistical estimation theories presume multivariate normality, especially with regard to the standard errors for the estimates of model parameters. And some CFA (and SEM) model fit statistics also presume normality.

Even univariate normality is not as straightforward as many researchers erroneously believe. Misconceptions regarding univariate normality occur, in part, because many textbooks only present a single picture of a univariate normal distribution: z scores that are normally distributed (i.e., the "standard normal" distribution).

There is only one univariate normal distribution for a given mean and a given standard deviation. But for every different mean or every different standard deviation, the appearance of the normal distribution changes. Some normal distributions when plotted in a histogram have the classic bell shape that many associate with the normal or Gaussian curve.


But bells come in infinitely many shapes. Some bells appear fairly flat and wide, whereas others appear tall and narrow. Of course, one universally true statement about normal distributions with different appearances is that they all are unimodal and symmetrical (i.e., coefficient of skewness = 0) and have a coefficient of kurtosis of zero.

Bivariate normality is even more complicated. Because there are now scores on two variables for each person, a three-dimensional space is required to represent the distribution, and three-dimensional objects must be located in this space to represent the scores of different people. The objects used might be little BBs or ballbearings, one for each person.

An X-axis and a perpendicular Y-axis might be drawn on a piece of paper. A vertical axis extending through the X,Y intercept might be used to measure the frequencies at different score pairs. For example, if Colleen and Deborah both had scores of 1,1, their BBs would be stacked on top of each other at this particular score pair coordinate.

If both variables are z scores and normally distributed, the BBs in bivariate data would create a literal three-dimensional bell shape. If we traced around the outer edges of the BBs on the paper, a circle would be drawn on the paper. If we made a slice through the BBs anywhere up the frequency axis, parallel to but moving away from the piece of paper, concentric and narrowing circles would be defined until we reached the top of the bell. If we sliced through the BBs with a knife, in any location only with the restriction that the knife cut all the way down through the vertical frequency axis, every slice would yield a univariate standard normal distribution.

If the Y scores were more spread out than the X scores, the signature or foot of the bell defined by tracing on the paper around the BBs would be an ellipse rather than a circle. If we sliced upward on the vertical axis, but parallel to the paper, every slice would yield a concentric ellipse until we reached the top of the bell. If we sliced through the BBs with a knife, in any location only with the restriction that the knife cut all the way down through the vertical frequency axis, every slice would yield a univariate normal distribution, albeit potentially slices that were narrower or wider than each other.

Univariate normality of both variables is a necessary but not sufficient condition for bivariate normality. Bivariate normality of all possible pairs of variables is a necessary but not sufficient condition for multivariate normality. Of course, multivariate normality requires that more than three dimensions are conceptualized. These are not easy dynamics to picture or to understand. Henson (1999) provided a very useful discussion of these concepts.

There are primarily two ways that researchers can test the multivariate normality of their data. First, if more than 25 or so measured variables are involved, the graphical technique (not involving a statistical significance test) described by Thompson (1990a) can be used.


Second, a statistical test of the multivariate coefficient of kurtosis, using the null hypothesis that kurtosis equals zero, is available in commonly used CFA/SEM computer programs. These computer programs also provide a "critical ratio" (t or Wald statistic) of this parameter estimate to its standard error. Researchers hope that the multivariate kurtosis coefficient is close to zero, and that the null hypothesis is not rejected.

What can be done if the assumptions of multivariate normality cannot be met? One choice is to use a parameter estimation theory, described in a subsequent section of this chapter, that does not require this assumption. However, "distribution-free" estimation theory does require even larger sample sizes than do more commonly used theories.

A second alternative is to create item "parcels" by bundling together measured variables. For example, if eight measured variables are presumed to measure Factor I, the data can be reexpressed as four measured variables. For example, each person's score on the measured variable that is most skewed left is added to the person's score on the measured variable that is most skewed right, the score on the measured variable that is second-most skewed left is added to the score on the measured variable that is second-most skewed right, and so forth. However, this strategy is subject to the general limitations of parceling methods (cf. Bandalos & Finney, 2001; Nasser & Wisenbaker, 2003).
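The pairing rule just described can be sketched in a few lines. This illustrative Python (not from the book) assumes a hypothetical n x 8 array named items containing scores on eight measured variables presumed to reflect one factor; scipy's skew function is used to order the items before summing opposite extremes into parcels.

    import numpy as np
    from scipy.stats import skew

    rng = np.random.default_rng(0)
    items = rng.normal(size=(100, 8))   # hypothetical scores: n = 100, 8 items

    # Order items from most negatively skewed (skewed left)
    # to most positively skewed (skewed right).
    order = np.argsort(skew(items, axis=0))

    # Pair the most left-skewed item with the most right-skewed item,
    # the second-most left-skewed with the second-most right-skewed, etc.
    parcels = np.column_stack([
        items[:, order[i]] + items[:, order[-(i + 1)]]
        for i in range(items.shape[1] // 2)
    ])
    print(parcels.shape)                # (100, 4): eight items -> four parcels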

Outlier Screening

It is well-known that outliers can grossly influence statistical results for a data set, even if the number of outliers is small and the sample size is large. Outliers can influence CFA results (see Bollen, 1987) just as they can other analyses.

Yet misconceptions regarding outliers abound. Some researchers incorrectly believe that outliers have anomalous scores on all possible variables with respect to all possible statistics. One can almost visualize evil, aberrant people whose foreheads are (or should be) branded with large capital Os. Outliers are persons with aberrant or anomalous scores on one or more variables with respect to one or more statistics. But most outliers are not outliers on all variables or with respect to all statistics. Consider the scores of Tom, Dick, and Harry on the variables X and Y:

Person      X      Y
Tom         1      2
Dick        2      4
Harry     499    998


TABLE 10.5
Covariance and Correlation Matrices for Eight Measured Variables (n = 100 Faculty)

                                  Measured variable
Variable    PER1     PER2     PER3     PER4     PER5     PER6     PER7     PER8
PER1        1.847    1.768    1.340    1.681    0.921    0.962    0.986    1.344
PER2        0.844    2.374    1.278    1.787    0.816    0.797    0.835    1.468
PER3        0.653    0.549    2.280    1.488    0.992    1.026    1.132    1.176
PER4        0.825    0.774    0.657    2.248    1.043    1.019    0.967    1.338
PER5        0.328    0.256    0.318    0.336    4.274    3.543    3.375    3.322
PER6        0.349    0.255    0.335    0.335    0.844    4.122    3.231    3.150
PER7        0.388    0.289    0.400    0.344    0.872    0.850    3.506    2.935
PER8        0.461    0.444    0.363    0.416    0.749    0.723    0.730    4.605

Note. Variances appear on the diagonal and covariances above it; Pearson rs (i.e., standardized covariances, italicized in the original) appear below the diagonal. Pearson r = COV_XY / (SD_X x SD_Y) (e.g., 0.844 = 1.768 / (1.847^.5 x 2.374^.5)). COV_XY = r_XY x SD_X x SD_Y (e.g., 1.768 = 0.844 x 1.847^.5 x 2.374^.5).
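The conversion arithmetic in the table note can be checked directly. The brief Python sketch below (an illustration, not part of the book) uses the PER1 and PER2 entries from Table 10.5 to move between the covariance and the Pearson r.

    var_per1, var_per2 = 1.847, 2.374   # diagonal entries (variances) in Table 10.5
    cov_12 = 1.768                      # covariance of PER1 and PER2

    sd_per1, sd_per2 = var_per1 ** 0.5, var_per2 ** 0.5

    r_12 = cov_12 / (sd_per1 * sd_per2)   # standardized covariance
    print(round(r_12, 3))                 # 0.844, matching the table

    cov_back = r_12 * sd_per1 * sd_per2   # COV_XY = r_XY * SD_X * SD_Y
    print(round(cov_back, 3))             # 1.768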

Mahalanobis distances can also be obtained within SPSS by performing a regression listing the measured variables as the predictor variables in a multiple regression analysis and using any other variable as the dependent variable. For this example, if participant ID number was used as the pseudo-dependent variable, the SPSS syntax would be:

    regression variables=id per1 to per8/
      descriptive=mean stddev corr/
      dependent=id/enter per1 to per8/
      save=mahal(mahal) .
    sort cases by mahal (D) .
    print formats mahal (F8.4) .
    list variables=id per1 to per8 mahal/cases=999 .
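For readers working outside SPSS, a comparable computation can be done by hand. The Python sketch below is an illustrative, assumption-laden example (the array named data stands in for a hypothetical n x 8 matrix of the PER1-PER8 scores); it computes each case's squared Mahalanobis distance from the variable means, essentially the quantity the SPSS job above saves, and lists the most extreme cases.

    import numpy as np

    rng = np.random.default_rng(1)
    data = rng.normal(size=(100, 8))     # hypothetical PER1-PER8 scores

    means = data.mean(axis=0)
    cov = np.cov(data, rowvar=False)     # 8 x 8 covariance matrix
    cov_inv = np.linalg.inv(cov)

    centered = data - means
    # Squared Mahalanobis distance for each of the 100 cases.
    d2 = np.einsum('ij,jk,ik->i', centered, cov_inv, centered)

    # Sort cases from most to least extreme, as the SPSS job does.
    most_extreme = np.argsort(d2)[::-1][:5]
    print(most_extreme, np.round(d2[most_extreme], 4))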

However, even when we know that a given person is an outlier on a given set of variables in regard to particular statistics, it is not always clear what should be done with the person's scores. If the researcher interviews the outlier and determines that bizarre scores were intentionally generated, then deleting the person's scores from the data set seems only reasonable. But when we cannot determine the reason for the score aberrance, it seems less reasonable to delete scores only because we are unhappy about their atypicality with respect to deviations from the means.

One resolution refuses to represent all analytic choices as being either/or. The analysis might be conducted both with the outlier candidates in the data and with them omitted. If interpretations are robust across various decisions, at least the researcher can be confident that the results are not artifacts of methodological choices.


It is also critical to take into account the analytic context when making these decisions. The context is very different if one is doing research in a previously unexplored area versus an arena in which numerous studies have been conducted. When other studies have been done the researcher can feel more comfortable that outliers are not compromising generalizability, if results replicate across studies. By the same token, researchers may be somewhat less concerned about outliers when they are also conducting internal replicability analyses.

As Cohen, Cohen, West, and Aiken (2003) argued,

    The extent of scrutiny of these statistics depends on the nature of the study. If the study is one of a series of replications, inspection of graphical displays for any obvious outliers is normally sufficient. Consistency of the findings across replications provides assurance that the presence of an outlier is not responsible for the results [italics added]. On the other hand, if the data set is unique and unlikely to be replicated (e.g., a study of 40 individuals with a rare medical disorder), very careful scrutiny of the data is in order. (pp. 410-411)

Parameter Estimation Theory

Conventional graduate courses in applied statistics teach analyses such as t tests, analysis of variance, multiple regression, descriptive discriminant analysis, and canonical correlation analysis. All of these analyses use sample data to compute sample statistics (e.g., sample means, correlation coefficients) that are then taken as estimates of corresponding population parameters (e.g., μ, σ, ρ).

All of these analyses invoke a single statistical theory for parameter estimation called ordinary least squares (OLS). These estimates seek to maximize the sum-of-squares_MODEL, or its multivariate equivalent, and the corresponding r² effect size analog (e.g., R², η²) in the sample data.

However, there are (many) other parameter estimation theories that can also be used. Indeed, it can be argued that classical statistical analyses are not very useful, and may even routinely lead to incorrect research conclusions, because classical methods are so heavily influenced by outliers (see Wilcox, 1998).

Two alternative parameter estimation theories are fairly commonly used in CFA (and SEM). Hayduk (1987) and Jöreskog and Sörbom (1993) provided more detail on these and other choices. When sample size is large, and if the model is correctly specified, the various estimation theories tend to yield consistent and reasonably accurate estimates of population parameters (e.g., population factor pattern coefficients).

First, maximum likelihood theory is the default parameter estimation theory in most statistical packages. This estimation theory attempts to estimate the true population parameters, and not to reproduce the sample data or minimize the sample sum-of-squares_UNEXPLAINED while maximizing the related sample effect size (e.g., R², η²). This is accomplished by applying maximum likelihood theory to the sample covariance matrix to estimate the population covariance matrix, Σ_v×v, and then deriving factors to reproduce that matrix rather than the sample covariance matrix.

Obviously, estimating population parameters with a statistical theory that optimizes this estimation (rather than features of the sample data) is very appealing. Better estimation of population parameters should yield more replicable results. This estimation theory also has the appeal that most of the statistics that evaluate model fit are appropriate to use when this theory has been invoked. However, maximum likelihood estimation theory does require that the data have at least an approximately multivariate normal distribution.

Second, asymptotically distribution-free (ADF) estimation theory may be invoked. As the name of this theory implies, these estimates tend to be accurate parameter estimates regardless of distribution shapes as the sample size is increasingly large. This is an appealing feature of the theory. However, very large sample sizes of more than 1,000 cases are required to invoke this theory for some analytic problems.

Of course, even CFA maximum likelihood estimation often requires larger sample sizes than would EFA modeling for the same data set (MacCallum, Browne, & Sugawara, 1996). West, Finch, and Curran (1995) reviewed some of the relevant issues and choices regarding the distributional assumptions of various parameter estimation theories.

Jackson (2001) found that sample size per se, and not the number of parameters being estimated, is the critical ingredient in deriving accurate CFA estimates. And sample size is less of a concern when the average pattern coefficient is |.80| or greater.

POSTANALYSIS DECISIONS

Once the analysis is in hand, various postanalysis decisions arise. First, the goodness of fit for each rival model must be evaluated. Second, specific errors in model specification must be considered.

Model Fit Statistics

A unique feature of CFA is the capacity to quantify the degree of model fit to the data. However, there are dozens of fit statistics, and the properties of the fit statistics are not completely understood. As Arbuckle and Wothke (1999) observed, "Model evaluation is one of the most unsettled and difficult issues connected with structural modeling [and CFA]. . . . Dozens of statistics . . . have been proposed as measures of the merit of a model" (p. 395).

Classically, Monte Carlo or simulation studies are used to explore the behaviors of statistics under various conditions (e.g., different sample sizes, various degrees of violations of distributional assumptions). In Monte Carlo research, populations with known parameters are created (e.g., with an exactly known degree of deviation from normality), and numerous random samples (e.g., 1,000, 5,000) from these populations are drawn to evaluate how the statistics perform.

Many of the previous simulation studies of CFA/SEM model fit statistics specified populations associated with known models, and then drew samples and tested the fit of the known models. Such simulation studies are not realistic as regards real research situations, because in real research the investigator does not usually know the true model. Indeed, if we knew the true model, we would not be doing the modeling study.

Better simulation studies of fit statistics have created populations for certain models, and then intentionally evaluated the behavior of fit statistics for various conditions and various model misspecifications. Recent simulation studies in this venue have provided important insights regarding how the fit statistics perform (cf. Hu & Bentler, 1999; Fan, Thompson, & Wang, 1999; Fan, Wang, & Thompson, 1997), but there is still a lot to learn (Bentler, 1994).

Four fit statistics are most commonly used today. In general, it is best to consult several fit statistics when evaluating model fit and to evaluate whether the results corroborate each other.

First, the χ² statistical significance test has been applied since the origins of CFA. As noted in chapter 2, in EFA the reproduced correlation matrix (R_v×v+) can be computed by multiplying P_v×f by P_f×v'. The residual correlation matrix (R_v×v-) is computed by subtracting R_v×v+ from R_v×v.

The same basic idea can be applied in CFA. One can estimate how much of a covariance matrix is reproduced by the parameter estimates, and how much is not. In CFA, the null hypothesis that is tested is essentially whether the residual covariance estimate equals a matrix consisting only of zeroes.

Unlike most substantive applications of null hypothesis statistical significance testing, in CFA one does not want to reject this null hypothesis (i.e., does not want statistical significance), at least for the preferred model. The degrees of freedom for this test statistic are a function of the number of measured variables and of the number of estimated parameters. For example, if there are eight measured variables, the df_TOTAL are 36 ([8 x (8 + 1)] / 2). If eight factor pattern coefficients, eight error variances, and one factor covariance are estimated, the degrees-of-freedom_MODEL equal 19 (i.e., 36 - 17).

However, as with conventional statistical significance tests, though the sample size is not used in computing the degrees of freedom for the test, the sample size is used in computing the χ². Even for the same input covariance matrix and the same degree of model fit, χ² is larger for a larger sample size. This makes this fit statistic not very useful when sample size is large, as it usually should be in CFA, because with large sample sizes all models will tend to be rejected as not fitting. As Bentler and Bonnett (1980) observed, "in very large samples virtually all models that one might consider would have to be rejected as untenable. . . . This procedure cannot generally be justified, since the chi-square value . . . can be made small by simply reducing sample size" (p. 591).

The χ² test is not very useful in evaluating the fit of a single model. But the test is very useful in evaluating the comparative fits of nested models. Models are nested when a second model is the same as the first model, except that one or more additional parameters (e.g., factor covariances) are estimated or freed in the second model.

The χ²s and their degrees of freedom are additive. For example, if for the first model positing two uncorrelated factors the χ² is 49.01 (df = 20), and for the second model estimating the same parameters except that the factor covariances are freed to be estimated the χ² is 31.36 (df = 19), then the differential fit of the two models can be statistically tested using χ² = 17.65 (i.e., 49.01 - 31.36) with df = 1 (20 - 19). The p_CALCULATED of this difference can be determined using the Excel spreadsheet function

    =chidist(17.65,1),

which here yields p = .000027.
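The same p value can be obtained outside Excel. A minimal Python equivalent of the =chidist(17.65, 1) call, using scipy's chi-square survival function, is sketched below.

    from scipy.stats import chi2

    chi_sq_uncorrelated = 49.01   # df = 20, factors constrained uncorrelated
    chi_sq_correlated = 31.36     # df = 19, factor covariance freed

    delta_chi_sq = chi_sq_uncorrelated - chi_sq_correlated   # 17.65
    delta_df = 20 - 19                                       # 1

    p = chi2.sf(delta_chi_sq, delta_df)   # same value =chidist(17.65, 1) returns
    print(round(p, 6))                    # about 0.000027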

When maximum likelihood estimation theory is used, and the multivariate normality assumption is not perfectly met, the χ² statistic will be biased. However, Satorra and Bentler (1994) developed a correction for this bias that may be invoked in such cases.

Second, the normed fit index (NFI; Bentler & Bonnett, 1980) compares the χ² for the tested model against the χ² for the baseline model presuming that the measured variables are completely independent. For example, if the χ² for the tested CFA model was 31.36, and the χ² for the baseline independence model presuming the measured variables were perfectly uncorrelated was 724.51, the NFI would be (724.51 - 31.36) / 724.51 = 693.15 / 724.51 = 0.96. The NFI can be as high as 1.0. Generally, models with NFIs of .95 or more may be deemed to fit reasonably well.
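The NFI arithmetic can likewise be scripted. The short Python sketch below (illustrative, not from the book) reproduces this example: a baseline independence-model χ² of 724.51 and a tested-model χ² of 31.36 yield an NFI of about .96.

    def normed_fit_index(chi_sq_baseline: float, chi_sq_model: float) -> float:
        """NFI (Bentler & Bonnett, 1980): proportional improvement in chi-square
        over the independence (baseline) model."""
        return (chi_sq_baseline - chi_sq_model) / chi_sq_baseline

    print(round(normed_fit_index(724.51, 31.36), 3))   # 0.957, i.e., about .96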


When maximum likelihood estimation theory is used, and the multivariate normality assumption is not perfectly met, the standard errors for the parameter estimates will also be inaccurate. Satorra and Bentler (1994) proposed methods for estimating the correct standard errors, and consequently correct critical ratios, using these standard errors as denominators, by taking into account data distribution shapes. Fouladi (2000) found that these corrections work reasonably well if sample sizes are 250 or more. Most modern software provides these corrections as they are needed.

Parameters Erroneously Fixed

For each parameter fixed to not be estimated that could be estimated, the CFA/SEM software can estimate the decrease (improvement) in the model fit χ² resulting from freeing a given previously fixed parameter instead to be estimated. In some packages (e.g., AMOS) these modification indices are "lower-bound" or conservative estimates, and if a fixed parameter is freed the χ² reduction predicted by the software may be better than predicted. Obviously, fixed parameters with the largest modification indices are among the leading candidates for model respecification.

Model Specification Searches

Confirmatory factor analysis is intended to be, as the name implies, confirmatory. Respecifying a CFA model based on consultation of critical ratio and modification index statistics is a dicey business, if the same sample is being used to generate these statistics and then to test the fit of the respecified model.

Using the same sample to respecify the model, and then test the respecified model, increases capitalization on sampling error variance and decreases the replicability of results. And using the same sample in this manner also turns the analysis into an exploratory one, albeit using CFA algorithms.

As Thompson (2000c) emphasized, model respecification should never "be based on blind, dust-bowl empiricism. Models should only be respecified in those cases in which the researcher can articulate a persuasive rationale as to why the modification is theoretically and practically defensible" (p. 272). Model respecification is reasonable when each change can be justified theoretically, and when the fit of the respecified model can be evaluated in a holdout or independent sample.

MAJOR CONCEPTS

Exploratory factor analysis and CFA are part of the same general linear model (Bagozzi et al., 1981). However, CFA models can estimate some parameters that cannot be evaluated within EFA. For example, only in CFA can the covariances of error variances be nonzero and estimated. In EFA factors can be either correlated or uncorrelated, but in CFA additionally some factors may be allowed to be correlated whereas other pairs of factors in the same model are constrained to be uncorrelated. Or, uniquely in CFA, factor correlations can be freed but constrained to be equal. And in CFA, unlike EFA, all the pattern coefficients for a factor can be freed to be estimated but constrained to be equal.

Confirmatory factor analysis models require the specification of one or more hypothesized factor structures, and the analyses directly test and quantify the degrees of fit of rival models. Because more than one model may fit in a given study, it is important to test the fit of multiple plausible models, so that any case for fit of a given model is more persuasive.

Only "identified" factor models can be tested in CFA. In CFA, degrees of freedom are a function of number of measured variables and not of sample size. A necessary but not sufficient condition for model identification is that the number of estimated parameters (e.g., pattern coefficients, factor correlations, error variances) cannot exceed the available degrees of freedom. Ideally, considerably fewer parameters will be estimated than there are degrees of freedom, so that the model is more disconfirmable. Fit when more degrees of freedom are unspent is inherently more impressive. Some factor pattern coefficients or factor variances must also be constrained to obtain model identification.

Like EFA, CFA requires various decisions, such as what matrix of associations (e.g., covariance matrix) to analyze, and what estimation theory (e.g., maximum likelihood) to use. Some estimation theories require that data are multivariate normal. And screening data for outliers can be particularly important when very elegant methods, such as CFA, are brought to bear.

Confirmatory factor analysis model tests yield various statistics that quantify degree of model fit. It is generally best practice to consult an array of fit statistics (Fan et al., 1999; Hu & Bentler, 1999) in the hope that they will confirm each other.

The analysis also yields critical ratios that evaluate whether freed parameters were indeed necessary, and modification indices that indicate whether fixed parameters instead should have been estimated. Respecifying models based on such results is a somewhat dicey proposition, unless theory can be cited to justify model revisions. Such "specification searches" are, however, completely reasonable and appropriate when the respecified model is tested with a new, independent sample.


11

SOME CONFIRMATORY FACTOR ANALYSIS INTERPRETATION PRINCIPLES

Many principles of confirmatory factor analysis (CFA) mirror those in exploratory factor analysis (EFA; e.g., the importance of consulting both pattern and structure coefficients), given the relationships of these two analyses within the general linear model (GLM). However, there are also aspects of CFA that are unique to CFA model tests.

For example, in EFA we almost always use only standardized factor pattern coefficients. However, in CFA both unstandardized pattern coefficients (analogous to regression unstandardized b weights) and standardized pattern coefficients (analogous to regression standardized β weights) are computed.

The statistical significance tests of individual CFA pattern coefficients are conducted using the unstandardized coefficients. These test statistics are computed by dividing a given parameter estimate by its respective standard error to compute a t (or Wald or critical ratio) statistic. Noteworthy unstandardized coefficients using these tests also indicate the noteworthiness of the corresponding standardized parameters, as in multiple regression.

However, when parameters are compared with each other, usually the standardized coefficients are reported, again as in multiple regression. The standardization allows apples-to-apples comparisons of related parameters within a single model.


Five aspects of CFA practice are elaborated here. These include (a) alternative ways to create identified models, (b) some estimation theory choices, (c) the importance of consulting both pattern and structure coefficients when CFA factors are correlated, (d) comparing fits of nested models, and (e) extraction of higher-order factors.

In these illustrative analyses, the Appendix A data provided by the 100 faculty members for the first eight items are used. Readers may find it useful to conduct parallel analyses using the appended data provided by the 100 graduate students.

MODEL IDENTIFICATION CHOICES

As explained in chapter 10, parameters cannot be estimated unless the model is "identified." A model is unidentified or underidentified if for that model an infinite number of estimates for the parameters are equally plausible.

A necessary but insufficient condition for identification is that the number of parameters being estimated must be equal to (just identified) or less than (overidentified) the available degrees of freedom. In this example the number of measured variables is eight. Therefore, the available degrees of freedom are 36 ([8 x (8 + 1)] / 2).

The models tested here presume that items PER1 through PER4 reflect only an underlying factor named "Service Affect" and that items PER5 through PER8 reflect only an underlying factor named "Library as Place." Thus, a total of up to eight factor pattern coefficients are estimated. Also, the error variances for each of the eight measured variables are estimated. It is assumed here that the factors are uncorrelated.

However, for this model we cannot estimate these 16 (8 + 8) parameters because the scales of the two latent factors have not been indicated. These scales of measurement of latent constructs are arbitrary. That is, there is no intrinsic reason to think that "Library as Place" scores in the population should have a particular variance or standard deviation. But we must declare some scale for these latent variables, or with each of infinitely many different possible variances there would be infinitely many corresponding sets of other parameters being estimated.

There are two basic ways to identify CFA models. First, for a given factor we can "fix" or constrain any given unstandardized factor pattern coefficient to be any mathematically plausible number. However, usually the number 1.0 is used for this purpose. In doing so, we are scaling the variance of the latent construct to be measured as some function of the measured variable for which we fix the pattern coefficient. Some researchers prefer to fix the pattern coefficient for the measured variable believed to be most psychometrically reliable. However, the selection is basically arbitrary.

Second, we can estimate all the pattern coefficients and fix the factor variances to any mathematically plausible or "admissible" number. Because variances cannot be negative, admissible numbers for variance estimates may not be negative. And the factor variance cannot be zero, or we would be evaluating the influences of a variable that was no longer a variable.

When factor variances are fixed, usually the variance parameters are fixed to be 1.0. This approach has a singular advantage: When we are estimating factor covariances, the covariances of the factors with each other now become factor correlations, because given that r(I x II) = COV(I x II) / (SD_I x SD_II), when the factor variances are both 1.0, we obtain COV(I x II) / (1 x 1). This makes interpretation more straightforward.
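
A minimal numeric sketch of this rescaling follows; the covariance value used is hypothetical and is included only to show that nothing changes when the divisor is 1.0.

```python
# Converting a factor covariance to a correlation: r = cov / (sd_I * sd_II).
# When both factor variances are fixed to 1.0, the divisor is 1 * 1, so the
# covariance already is the correlation. The .42 below is a made-up value.

cov_I_II = 0.42            # hypothetical factor covariance
var_I, var_II = 1.0, 1.0   # factor variances fixed to 1.0 for identification

r_I_II = cov_I_II / ((var_I ** 0.5) * (var_II ** 0.5))
print(r_I_II)              # 0.42, unchanged, because the divisor is 1.0
```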

In addition, t statistics are only obtained for estimated factor pattern coefficients. So it is helpful to know the statistical significance, or Wald ratios, for each factor pattern coefficient, and these would not be available if the model were identified by constraining some factor pattern coefficients.

The approach also has the advantage that all the pattern coefficients for a given factor can be compared with each other, because they were all freed. In general, we may expect that pattern coefficients on a given factor will be similar in magnitudes.
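
Readers who wish to try these identification choices in software could use a sketch such as the following. The open-source semopy package for Python is assumed here purely for illustration (it is not the software used for the analyses in this book), and the lavaan-style model string, the column names, and the file name are likewise assumptions; packages of this kind typically identify each factor by fixing its first loading to 1.0 by default, so the alternative of fixing factor variances should be checked against the package documentation.

```python
# A sketch of the two-factor CFA using the semopy package (an assumption;
# the book does not use this software). Column names PER1-PER8 and the
# CSV file name are hypothetical stand-ins for the Appendix A data.
import pandas as pd
import semopy

data = pd.read_csv("appendix_a_faculty.csv")  # hypothetical file name

model_desc = """
ServAff  =~ PER1 + PER2 + PER3 + PER4
LibPlace =~ PER5 + PER6 + PER7 + PER8
"""

model = semopy.Model(model_desc)
model.fit(data)

print(model.inspect())            # parameter estimates
print(semopy.calc_stats(model))   # chi-square, CFI, RMSEA, and related fit indices
```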

Table 11.1 compares three analyses of these data with three different model identification strategies having been invoked. In Model 1 the unstandardized pattern coefficient for variable PER1 on Factor I ("Service Affect") was arbitrarily fixed to equal 1.0, and the unstandardized pattern coefficient for variable PER8 on Factor II ("Library as Place") was fixed to equal 1.0. The factor variances were both "freed" to be estimated.

In Model 2 the unstandardized pattern coefficient for variable PER2 on Factor I ("Service Affect") was arbitrarily fixed to equal 1.0, and the unstandardized pattern coefficient for variable PER5 on Factor II ("Library as Place") was fixed to equal 1.0. The factor variances were both "freed" to be estimated.

In Model 3, portrayed graphically in Figure 11.1, all eight pattern coefficients were estimated. However, to identify the model, both factor variances were constrained to be 1.0.

Note from Table 11.1 that the standardized pattern coefficients and the factor structure coefficients are identical across the three models, regardless of how the models were identified. And the fit statistics are identical. So my preference for identifying CFA models by fixing factor variances is arbitrary but permissible as a matter of stylistic choice.

[Table 11.1, comparing pattern/structure coefficients and fit statistics across the three model identification strategies, appears here in the original; its values could not be recovered from the source scan.]

Figure 11.1. Confirmatory factor analysis factor model with eight measured variables and two correlated factors with factor variances constrained to equal 1.0. ServAff = "Service Affect"; LibPlace = "Library as Place."


DIFFERENT ESTIMATION THEORIES

As noted in chapter 10, classical statistics (e.g., analysis of variance, regression, canonical correlation analysis) invoke a statistical estimation theory called ordinary least squares (OLS). However, there are numerous other statistical estimation theories that can be invoked to estimate model parameters. The theory most commonly used in CFA (and in structural equation modeling [SEM]) is maximum likelihood theory. Most of the fit statistics have been developed for this theory.

However, maximum likelihood estimation does presume that data are multivariate normally distributed. Another estimation theory, asymptotically distribution-free (ADF) theory, can be invoked if the distributional assumptions of maximum likelihood cannot be met. However, ADF estimation does presume large sample size.

For comparative purposes, Table 11.2 presents the standardized pattern and the structure coefficients for the Model 3 fit to the faculty data using three different estimation theories. Note that the pattern and structure coefficients estimated in all three models are reasonably comparable.

These analyses were compared primarily for heuristic purposes. However, it is not entirely a bad thing to invoke different analytic strategies for the same data, just to confirm that results are not artifacts of analytic choices. In this instance, probably only the maximum likelihood estimates would be reported, but the researcher would sleep easier knowing that similar factors emerged for the other analyses.

CONFIRMATORY FACTOR ANALYSIS PATTERN AND STRUCTURE COEFFICIENTS

In EFA, orthogonal solutions almost always provide simple structure, are the default models in statistical packages, and dominate the literature. However, in CFA, it is more usual to test either correlated factors models or both uncorrelated and correlated factors models. That is, usually both CFA models are reasonably plausible, and we want to test plausible rival models to have the strongest possible support for our preferred model, presuming that this model ultimately has best fit.

When factors are uncorrelated, the CFA standardized factor pattern and the structure coefficients for given variables on given factors exactly equal each other, as reflected in the illustrative results reported in Table 11.2. But when CFA factors are correlated, exactly as is the case in EFA, factor pattern and factor structure coefficients are no longer equal.

[Table 11.2, reporting the standardized pattern and structure coefficients for the Model 3 fit under three estimation theories, appears here in the original; its values could not be recovered from the source scan.]

For multiple regression (Courville & Thompson, 2001; Thompson & Borrello, 1985), descriptive discriminant analysis (Huberty, 1994), and canonical correlation analysis (cf. Cohen & Cohen, 1983, p. 456; Levine, 1977, p. 20; Meredith, 1964; Thompson, 2000a), when weights and structure coefficients are unequal, sensible interpretation usually requires examination of both sets of results. Structure coefficients are equally important in CFA (cf. Graham et al., 2003; Thompson, 1997). Thompson, Cook, and Heath (2003) presented illustrative strategies for reporting results for correlated CFA factors.

Put differently, as Bentler and Yuan (2000) emphasized, in CFA when factors are correlated, "even if a factor does not influence a variable, that is, its pattern coefficient or weight is zero, the corresponding structure coefficient, representing the correlation or covariance of the variable with that factor, generally will not be zero" (p. 327). There are three possible situations that arise with correlated factors, not counting combinations of these situations.

Univocal Models

One possibility is that each measured variable reflects the influences of only a single one of the correlated factors. Model 4 presented in Table 11.3 involves such a case.

For such "univocal" models in which each measured variable speaks on behalf of a single latent variable, the freed standardized parameters and the factor structure coefficients always exactly match. For example, PER1 had a pattern coefficient on Factor I in Model 4 of .950; the corresponding structure coefficient also was exactly .950.

However, note that none of the structure coefficients associated with pattern coefficients fixed to equal zeroes were themselves zeroes. For example, PER1 had a pattern coefficient on Factor II in Model 4 fixed to equal zero, but the corresponding structure coefficient was .398.

In other words, when factors are correlated, all measured variables are correlated with all factors. This dynamic must be considered as part of result interpretation. The higher the factor correlations are in absolute values, the more all variables will be correlated with all the factors.
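
This behavior follows from the standard relation between a standardized pattern matrix and the structure coefficients, S = P Phi (the pattern matrix postmultiplied by the factor correlation matrix). The numbers in the sketch below are hypothetical and are chosen only to mimic the univocal case just described.

```python
# Structure coefficients implied by a univocal pattern matrix and a factor
# correlation: S = P @ Phi. The values are hypothetical illustrations.
import numpy as np

P = np.array([[0.95, 0.00],    # a variable loading only on Factor I
              [0.00, 0.80]])   # a variable loading only on Factor II

Phi = np.array([[1.00, 0.42],  # hypothetical factor correlation of .42
                [0.42, 1.00]])

S = P @ Phi
print(S)
# [[0.95  0.399]   -> even with a zero pattern coefficient on Factor II,
#  [0.336 0.80 ]]     the structure coefficient is nonzero (.95 * .42)
```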

Correlated Error Variances

A unique feature of CFA versus EFA is the possibility of estimating covariances or correlations of measurement error variances. If the model is correctly specified, then the error variances of the measured variables reflect score unreliability (see Thompson, 2003).

For example, the variance of the measured variable PER1 was 1.847. In Model 4 the estimated error variance for this variable was .178. This means that, if the model is correctly specified, the unreliability of this measured variable could be estimated as .178 / 1.847 = .096 (or 9.6%). The corresponding reliability of this measured variable could be estimated as (1.847 - .178) / 1.847 = 1.669 / 1.847 = .904 (or 90.4%; or 100% - 9.6%).

[Table 11.3, reporting pattern/structure coefficients and fit statistics for the correlated-factors models discussed in this section, appears here in the original; its values could not be recovered from the source scan.]
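
The reliability arithmetic above is easy to verify; the sketch below simply restates the PER1 numbers from the text.

```python
# Reliability implied by a correctly specified model: the proportion of a
# measured variable's variance that is NOT error variance (PER1 example).
observed_variance = 1.847
error_variance = 0.178

unreliability = error_variance / observed_variance
reliability = (observed_variance - error_variance) / observed_variance

print(round(unreliability, 3))  # 0.096, i.e., about 9.6%
print(round(reliability, 3))    # 0.904, i.e., about 90.4%
```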

Measurement error variances in classical statistics are usually taken as being random and therefore independent. However, in some cases elements of measurement error may be systematic and overlapping. For example, in a study involving perceptions of gender roles, if two items in a given study invoked strong homophobic emotional reactions, the error variances for these two items might be correlated even though this dynamic is extraneous to the primary features of the model.

Model 5 allows the two error variances for the Factor I measured variables, "Giving users individual attention" and "Employees who deal with users in a caring fashion," to covary. This additional single parameter estimate costs one additional degree of freedom, as reported in Table 11.3. But only allowing error covariances to be nonzero does not change the structure coefficients associated with freed pattern coefficients, and the freed pattern coefficients and related structure coefficients will remain equal.

Multivocal Models

Model 6 is presented graphically in Figure 11.2. In this model there is one measured variable (PER8) through which both factors speak. Note that for such variables the freed pattern coefficients no longer equal the associated structure coefficients (i.e., .206 ≠ .489, and .711 ≠ .793).

COMPARISONS OF MODEL FITS

In any CFA (or SEM) model testing, multiple models may fit the same data. Thus, it is not best practice to test only a single, preferred model (Thompson, 2000c). Instead, it is desirable to test multiple plausible rival models. Then, if the preferred model is the only model with reasonable fit, the case for that model is correspondingly stronger.

In a factor analytic context, rival models would often include a model specifying independence of the measured variables (i.e., no factors), a model positing a single factor, and models positing both uncorrelated and correlated factors. Table 11.4 presents fit statistics for these models for these data.

The statistical significance test for the baseline independence model positing no factors results in rejection of the null hypothesis that the model fits. Note that the calculated p value for this test is reported in scientific notation as "1.45E-134." This means that after the decimal, the p is 133 zeroes followed by "145."


Figure 11.2. Confirmatory factor analysis Model #6 (Factors I and II multivocal through PER8). ServAff = "Service Affect"; LibPlace = "Library as Place."


[Table 11.4, reporting fit statistics for the rival models (independence, one factor, uncorrelated factors, and correlated factors), appears here in the original; its values could not be recovered from the source scan.]

The p is not equal to zero, regardless of what the statistical package may print to relatively few decimal places. A common error in published literature is erroneously reporting that p = .000, which means that the researcher impossibly obtained an impossible result (which is impossible). Such small values can be accurately reported as p < .001, or exact values can be derived using Excel spreadsheet functions such as:

=CHIDIST(724.51, 28),

which here yields p = "1.45E-134."
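
The same quantity can be computed outside of Excel; for example, the following sketch uses the chi-square survival function in scipy, which should agree with the Excel value reported above.

```python
# Right-tail chi-square probability, the same quantity Excel's CHIDIST returns.
from scipy.stats import chi2

p = chi2.sf(724.51, df=28)   # survival function = P(chi-square > 724.51)
print(p)                     # an extremely small value, on the order of 1e-134
```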

For these tests, Model 4 provides reasonable fit. This is true notwithstanding the p value (.03685) associated with the test. As noted in chapter 10, the p values for model fit tests are inflated by the large sample sizes usually used in CFA, as are statistical significance tests in other contexts (Thompson, 1996).

The significance test is more useful in looking at comparative fit of nested models. In this example, Model 4 posited correlated factors, estimated 17 parameters (8 pattern coefficients, 8 error variances, and 1 covariance), and retained 19 (36 - 17) degrees of freedom. Model 3 posited uncorrelated factors, estimated 16 parameters (8 pattern coefficients, 8 error variances), and retained 20 (36 - 16) degrees of freedom.

The models are nested because Model 3 is Model 4 with one parameter (a covariance) removed. This means that the differences in the model fits can be quantified by subtracting the χ² values (48.27 - 31.36 = 16.91) and the corresponding degrees of freedom (20 - 19 = 1). The improvement in Model 4 over Model 3 is statistically significant (p = .00004).
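
The chi-square difference test just described can be reproduced with the same tools; the sketch below uses the values quoted in the preceding paragraph.

```python
# Chi-square difference test for the nested Models 3 and 4.
from scipy.stats import chi2

chi2_model3, df_model3 = 48.27, 20   # uncorrelated factors
chi2_model4, df_model4 = 31.36, 19   # correlated factors

delta_chi2 = chi2_model3 - chi2_model4   # 16.91
delta_df = df_model3 - df_model4         # 1

p = chi2.sf(delta_chi2, df=delta_df)
print(round(delta_chi2, 2), delta_df, p)  # 16.91 1, p of about .00004
```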

HIGHER-ORDER MODELS

As explained in chapter 6, in EFA higher-order factors can be extracted where lower-order factors are correlated. Given the GLM, the same principles apply in CFA. One difference in the analyses, however, is that higher-order factors in CFA are extracted with the factor covariances/correlations fixed to be zeroes. In CFA higher-order analysis, the factor covariances are posited to be accounted for by the higher-order factors.

Three First-Order  Factors

Figure 11.3 presents a higher-order model involving 12 measured variables and 3 first-order factors for the data provided by the 100 faculty. In CFA higher-order analyses, usually there are three or more first-order factors so that the higher-order parameters will become identified (see Byrne, 1998, pp. 170-173).


Figure 11.3. Second-order model with three first-order factors. ServAff = "Service Affect"; LibPlace = "Library as Place"; InfoAcc = "Information Access."

The model graphically portrayed in Figure 11.3 has 78 degrees of freedom ([12 x (12 + 1)] / 2). The model expends 27 of the 78 degrees of freedom to estimate 12 error variances of the measured variables, 9 first-order factor pattern coefficients, 3 variances of the first-order factors, and 3 second-order factor pattern coefficients (12 + 9 + 3 + 3 = 27). Three first-order pattern coefficients were fixed to equal 1s, and the second-order factor's variance was fixed to equal 1.
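
The degree-of-freedom bookkeeping for this model can be tallied directly, as in the brief sketch below, which simply restates the counts in the preceding paragraph.

```python
# Degrees of freedom for the second-order model with 12 measured variables.
n_measured = 12
df_available = n_measured * (n_measured + 1) // 2        # 78

params = {
    "error variances": 12,
    "first-order pattern coefficients": 9,
    "first-order factor variances": 3,
    "second-order pattern coefficients": 3,
}
df_spent = sum(params.values())                           # 27

print(df_available, df_spent, df_available - df_spent)    # 78 27 51
```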


Table 11.5 presents the parameter estimates and fit statistics for this analysis. As before, note that pattern coefficients fixed to equal zero do not imply that corresponding structure or other correlations are also zero.

Note that these results suggest a reasonable model fit. These three latent variables do measure a second-order factor. But the "Information Access" first-order factor has the highest correlation with the second-order factor (r = .990). The first-order factor "Library as Place" has the lowest of the three correlations with the second-order factor (r = .602). This analysis reflects the richness of second-order factor analysis and its capacity to model complex hierarchical dynamics.

Two First-Order Factors

A second-order CFA corresponding to Model #4 can be executed, if additional constraints are imposed so that the model becomes identified, because there are only two first-order factors. The constraint illustrated in Figure 11.4 imposes a restriction that the two second-order pattern coefficients for the first-order factors are to be estimated, but that these estimates must be equal. In SEM/CFA software, such restrictions are communicated by using a single letter for two or more parameters constrained to be equal.

Table 11.6 presents the parameter estimates and the fit statistics associated with this model. Note that the fit statistics reported in Table 11.6 for this second-order model are identical to the fit statistics reported in Table 11.3 for the correlated first-order factor model, Model #4.

In CFA, higher-order factor analysis reexpresses the correlations among the first-order factors by modeling these dynamics in an alternative model. However, the fits of these higher-order models and the corresponding first-order models are the same when there are fewer than four first-order factors for a given second-order factor. Nevertheless, the higher-order analysis does provide additional insight, because only in this model are the correlations of measured variables and first-order factors with higher-order constructs quantified.

MAJOR CONCEPTS

The decision of how to identify a factor model is arbitrary and does not impact fit statistics. However, stylistically, model identification via fixing factor variances to 1 has some advantages, such as making factor covariances also equal factor correlations and allowing comparisons of estimated pattern coefficients for all the measured variables on a given factor.


[Table 11.5, reporting the parameter estimates and fit statistics for the second-order model with three first-order factors, appears here in the original; its values could not be recovered from the source scan.]


Figure 11.4. Second-order model (Sec-Order) with two first-order factors. ServAff = "Service Affect"; LibPlace = "Library as Place."


12

TESTING MODEL INVARIANCE

The sine qua non of research is finding effects that replicate under stated conditions. When factor analytic results replicate across samples or analytic decisions, the results are said to be invariant. Confirmatory factor analysis (CFA) is ideally suited for testing invariance, because CFA allows constraining different types of result equalities (e.g., invariance of pattern coefficients, invariance of factor correlations) in different models and quantifies the degrees of model invariance across these variations.

Here three sets of invariance analyses are reported using the Appendix A data. The examples involve all 12 measured variables and the graduate student (n1 = 100) and faculty (n2 = 100) data. These analyses illustrate testing factor invariance across different respondent groups. Alternatives are equally plausible, such as investigating invariance over randomly created subgroups, across data collected from the same people over time, or across the same role group but different geographic origins (e.g., students from different regions of the country).

SAME MODEL, DIFFERENT ESTIMATES

The least restrictive test of factor invariance presumes only that the same model fits in different groups but that the parameters estimated in the different groups are independently estimated in each group. Figure 12.1 specifies an input model for such an analysis.


Figure 12.1. Confirmatory factor analysis factor model with 12 measured variables and 3 correlated factors with factor variances constrained to equal 1.0. ServAff = "Service Affect"; InfoAcc = "Information Access"; and LibPlace = "Library as Place."

In this analysis in both groups there are 78 degrees of freedom, because there are 12 measured variables ([12 x (12 + 1)] / 2). The specified model posits three correlated factors, with the factor variances constrained to equal 1s. Parameters freed to be estimated in both groups are 12 factor pattern coefficients, 12 error variances for the measured variables, and 3 factor covariances/correlations (12 + 12 + 3 = 27).


Thus, there are in both groups 51 (78 - 27) unspent degrees of freedom, for a total of 102 (2 x 51) available degrees of freedom after parameter estimation.
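
The same bookkeeping extends to the two-group analysis; the sketch below restates the counts just given.

```python
# Two-group CFA, parameters estimated separately in each group.
n_measured = 12
df_per_group = n_measured * (n_measured + 1) // 2    # 78

params_per_group = 12 + 12 + 3   # pattern coefficients + error variances
                                 # + factor covariances (variances fixed to 1)
unspent_per_group = df_per_group - params_per_group  # 51

print(unspent_per_group, 2 * unspent_per_group)      # 51 102
```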

Table 12.1 presents the standardized factor pattern coefficients, the factor structure coefficients, the factor covariances (here also correlations, because the factor variances are all fixed to equal 1), and fit statistics for the analysis. The fit statistics suggest that the model is a plausible fit to the data provided by both groups.

There are some differences in the parameters estimated independently across the two groups. For example, the item PER3 ("Employees who deal with users in a caring fashion") is more correlated with the "Service Affect" factor for graduate students (r_s² = .835² = 69.7%) than for faculty (r_s² = .692² = 47.9%). The difference in the squared structure coefficients, which can be directly compared because squaring has been invoked, is noteworthy. Students define affect of service in libraries more than faculty in terms of being dealt with in a caring fashion by library staff.

Conversely, the item PER6 ("A meditative place") is more correlated with the "Library as Place" factor for faculty (r_s² = .908² = 82.4%) than for students (r_s² = .753² = 56.7%). In this case, faculty perceptions of the library are more correlated with a quiet and contemplative atmosphere.

The factor correlations are reasonably similar. The biggest differences occur on the correlation of "Service Affect" with "Library as Place" for graduate students (r I×II = .569; .569² = 32.4%) versus faculty (r I×II = .418; .418² = 17.5%).

Nevertheless, the single model fit reasonably well in both groups, and the parameter estimates were reasonably similar across groups. Furthermore, the detected differences seem reasonably to reflect natural differences in needs and interests of the two groups of library users.

SAME MODEL, EQUAL ESTIMATES

Figure 12.2 presents a more constrained test of factor invariance across the two groups. In this case the three unstandardized factor pattern coefficients are constrained to equal 1s. The remaining nine pattern coefficients are freed to be estimated, but with the constraint that the set of nine unstandardized estimates must be equal across the two groups, as reflected by the use of letters within the drawing of the model. Also estimated independently in the two groups are 12 error variances for the measured variables, the variances of the 3 factors, and the 3 factor covariances.

Thus, from the available total of 156 degrees of freedom (78 x 2), a total of 45 (9 + 2 [12 + 3 + 3] = 9 + 2 [18] = 9 + 36) degrees of freedom are spent on parameter estimation, leaving 111 degrees of freedom unspent.


TABLE 12.1
Factor Pattern/Structure Coefficients and Fit Statistics for First-Order Factor Analysis With 12 Measured Variables (n1 = 100 Graduate Students; n2 = 100 Faculty)

Group/variable/   Pattern                         Structure
statistic         I        II       III           I        II       III

Graduate students
PER1              0.945    —        —             0.945    0.538    0.607
PER2              0.886    —        —             0.886    0.504    0.569
PER3              0.835    —        —             0.835    0.475    0.536
PER4              0.822    —        —             0.822    0.468    0.528
PER5              —        0.907    —             0.516    0.907    0.514
PER6              —        0.753    —             0.429    0.753    0.427
PER7              —        0.890    —             0.507    0.890    0.505
PER8              —        0.834    —             0.475    0.834    0.473
PER9              —        —        0.827         0.531    0.469    0.827
PER10             —        —        0.739         0.474    0.419    0.739
PER11             —        —        0.813         0.522    0.461    0.813
PER12             —        —        0.613         0.393    0.347    0.613
r I×II            0.569
r I×III           0.642
r II×III          0.567

Faculty
PER1              0.944    —        —             0.944    0.395    0.650
PER2              0.882    —        —             0.882    0.369    0.607
PER3              0.692    —        —             0.692    0.290    0.477
PER4              0.882    —        —             0.882    0.369    0.608
PER5              —        0.933    —             0.390    0.933    0.556
PER6              —        0.908    —             0.380    0.908    0.541
PER7              —        0.932    —             0.390    0.932    0.555
PER8              —        0.798    —             0.334    0.798    0.475
PER9              —        —        0.576         0.397    0.343    0.576
PER10             —        —        0.716         0.493    0.426    0.716
PER11             —        —        0.663         0.456    0.395    0.663
PER12             —        —        0.480         0.331    0.286    0.480
r I×II            0.418
r I×III           0.689
r II×III          0.596

χ²                135.03
df                102
χ²/df             1.32
NFI               0.924
CFI               0.980
RMSEA             0.040

Note. Parameter estimates "fixed" to be zeroes are reported as dashes ("—"). All critical ratios > |2.0|. NFI = normed fit index; CFI = comparative fit index; RMSEA = root-mean-square error of approximation.


Figure 12.2. Confirmatory factor analysis factor model with 12 measured variables and 3 correlated factors with equality constraints. ServAff = "Service Affect"; InfoAcc = "Information Access"; and LibPlace = "Library as Place."

In other words, the equality constraint means that the Figure 12.2 model is more parsimonious than the Figure 12.1 model by 9 degrees of freedom (111 - 102).
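
The parsimony gain from the equality constraints can be tallied the same way; the sketch below restates the counts from the two preceding paragraphs.

```python
# Degrees of freedom for the equality-constrained two-group model (Figure 12.2)
# versus the freely estimated two-group model (Figure 12.1).
total_df = 2 * 78                          # 156 across the two groups

# Figure 12.2: 9 pattern coefficients shared across groups, plus
# 12 error variances, 3 factor variances, and 3 covariances per group.
spent_constrained = 9 + 2 * (12 + 3 + 3)             # 45
unspent_constrained = total_df - spent_constrained   # 111

unspent_unconstrained = 102                # from the Figure 12.1 analysis
print(unspent_constrained - unspent_unconstrained)   # 9 degrees of freedom
```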

Table 12.2 presents the standardized factor pattern coefficients, the factor structure coefficients, the factor correlations, and fit statistics for the analysis.

TABLE 12.2
Factor Pattern/Structure Coefficients and Fit Statistics for First-Order Factor Analysis With Equality Constraints on Unstandardized Factor Pattern Coefficients (n1 = 100 Graduate Students; n2 = 100 Faculty)

Group/variable/   Pattern                         Structure
statistic         I        II       III           I        II       III

Graduate students
PER1              (0.944)  —        —             0.944    0.536    0.605
PER2              0.893    —        —             0.893    0.507    0.572
PER3              0.812    —        —             0.812    0.461    0.520
PER4              0.834    —        —             0.834    0.474    0.535
PER5              —        (0.907)  —             0.515    0.907    0.513
PER6              —        0.789    —             0.448    0.789    0.446
PER7              —        0.884    —             0.502    0.884    0.500
PER8              —        0.816    —             0.463    0.816    0.461
PER9              —        —        0.825         0.529    0.466    0.825
PER10             —        —        (0.763)       0.489    0.431    0.763
PER11             —        —        0.798         0.511    0.451    0.798
PER12             —        —        0.607         0.389    0.343    0.607
r I×II            0.568
r I×III           0.641
r II×III          0.565

Faculty
PER1              (0.947)  —        —             0.947    0.400    0.662
PER2              0.870    —        —             0.870    0.367    0.608
PER3              0.735    —        —             0.735    0.310    0.514
PER4              0.874    —        —             0.874    0.369    0.611
PER5              —        (0.932)  —             0.393    0.932    0.562
PER6              —        0.898    —             0.379    0.898    0.542
PER7              —        0.935    —             0.395    0.935    0.564
PER8              —        0.814    —             0.344    0.814    0.491
PER9              —        —        0.589         0.412    0.355    0.589
PER10             —        —        (0.658)       0.460    0.397    0.658
PER11             —        —        0.690         0.482    0.416    0.690
PER12             —        —        0.488         0.341    0.294    0.488
r I×II            0.422
r I×III           0.699
r II×III          0.603

χ²                144.47
df                111
χ²/df             1.30
NFI               0.919
CFI               0.980
RMSEA             0.039

Note. Parameter estimates "fixed" to be zeroes are reported as dashes ("—"). Parameter estimates "fixed" to equal 1.0 for the unstandardized model are reported in parentheses. All critical ratios > |2.0|. NFI = normed fit index; CFI = comparative fit index; RMSEA = root-mean-square error of approximation.

The fit statistics suggest that the model is again a plausible fit to the data provided by both groups.

The fit is not as good as the fit of the less constrained model, but nevertheless remains plausible. Further restrictions on equalities might be imposed (e.g., constraining factor covariances to be equal), but such additional constraints would necessitate some reasons as to why each set of additional parameters might be expected to be invariant.

Alternatively, a sequence of equality constraints might be imposed to detect where, if at all, invariance breaks down across groups (Byrne, 2001; Joreskog & Sorbom, 1993). The sequence of invariance tests with equality constraints might proceed as follows: (a) equality of unstandardized pattern coefficients; (b) equality of unstandardized pattern coefficients and factor covariances; (c) equality of unstandardized pattern coefficients, factor covariances, and factor variances; and (d) equality of unstandardized pattern coefficients, factor covariances, factor variances, and error variances of measured variables.

SAME MODEL, TEST INVARIANCE OF PARAMETERS

The ultimate test of model invariance is using some or all of the specific parameter estimates from one sample in a model test with an independent sample. Here the unstandardized pattern coefficients for the model estimated with graduate students were fit to the data provided by the faculty. Figure 12.3 presents the input model.

Table 12.3 presents the standardized factor pattern coefficients, the factor structure coefficients, the factor correlations, and fit statistics for the analysis. The fit statistics suggest that the model is near the boundary of plausible fit to the data provided by the faculty. Nevertheless, the degree of fit is striking given that selected specific parameters from the graduate students were used in the tested model.

In

  pract ice , researchers  usually  hope  t ha t

  the

  same cons t ruc ts

  are

measured

  across group s but  usually  do not expect factor pat tern

  coeff icients

to be

  completely invarian t . Nevertheless , such tes ts i l lus tra te

 the

  power

 of

CFA to find the

  boundar ies

 of

  resul t invariance.

MAJOR  CONCEPTS

Science is about isolating and understanding latent constructs. We seek constructs that replicate under stated conditions and that are useful for stated purposes. In other words, we seek factors that are invariant over samples and analyses.


[Figure 12.3 path diagram: measured variables Per1 through Per12, each with an error term (e1-e12), loading on three correlated factors.]
Figure 12.3. Confirmatory factor analysis factor model with 12 measured variables and 3 correlated factors, with 12 unstandardized pattern coefficients from graduate students constrained as fixed. ServAff = "Service Affect"; InfoAcc = "Information Access"; and LibPlace = "Library as Place."

For example, do we obtain the same or similar factors across independent samples of people? Do we obtain similar factors when we analyze two similar but different sets of measured variables? Do we obtain similar factors when we invoke different estimation theories?

CFA is ideally suited to testing invariance propositions, because of its flexibility and capacity to quantify degrees of model fit. When rival models for a given data set are nested, we can quantify the degree to which more


TABLE 12.3
Factor Pattern/Structure Coefficients and Fit Statistics for First-Order Factor Analysis for Faculty Data (n = 100) Fitting Graduate Students' Unstandardized Factor Pattern Coefficients

Variable/            Pattern                        Structure
statistic       I         II        III         I         II        III
PER1         [0.966]      —          —        0.966     0.445     0.756
PER2         [0.902]      —          —        0.902     0.415     0.706
PER3         [0.816]      —          —        0.816     0.375     0.638
PER4         [0.901]      —          —        0.901     0.415     0.705
PER5            —      [0.917]       —        0.422     0.917     0.527
PER6            —      [0.837]       —        0.385     0.837     0.481
PER7            —      [0.929]       —        0.428     0.929     0.534
PER8            —      [0.801]       —        0.369     0.801     0.461
PER9            —         —       [0.708]     0.554     0.407     0.708
PER10           —         —       [0.753]     0.590     0.433     0.753
PER11           —         —       [0.808]     0.632     0.464     0.808
PER12           —         —       [0.613]     0.480     0.352     0.613
r I×II        0.460
r I×III       0.783
r II×III      0.575
χ²          106.71
df           63
χ²/df         1.69
NFI           0.879
CFI           0.946
RMSEA         0.084

Note. Parameter estimates fixed to be zeroes are reported as dashes (—). Parameter estimates fixed to equal unstandardized pattern coefficients for the graduate students are reported in brackets. All critical ratios > |2.0|. NFI = normed fit index; CFI = comparative fit index; RMSEA = root-mean-square error of approximation. 78 - (12 + 3) freed estimates = 63.

parsimonious models estimating fewer parameters fit less well, as noted in chapter 11. Remember that less parsimonious models (i.e., models estimating more parameters) fit as well as or better than more parsimonious models, until at the extreme we estimate all possible parameters (i.e., df = 0), at which point every just-identified model with zero degrees of freedom fits perfectly.
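As a point of reference (standard CFA bookkeeping rather than wording from the text), the degrees of freedom being counted here equal the number of unique variances and covariances among the v measured variables minus the number q of freed parameter estimates:

$$df = \frac{v(v+1)}{2} - q .$$

With 12 measured variables, v(v + 1)/2 = 78, so a model freeing 15 parameters has df = 63 (as in the note to Table 12.3), and a just-identified model freeing all 78 has df = 0.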

Because χ²s for nested models are additive, we can subtract the χ² for the less parsimonious model (e.g., χ² = 31.36, df = 19) from the χ² for the more parsimonious model (e.g., χ² = 48.27, df = 20) to quantify how much better the less parsimonious model fits (e.g., χ² = 48.27 - 31.36 = 16.91). Because degrees of freedom from nested models are additive, we can also subtract the degrees of freedom (e.g., 20 - 19 = 1). This allows calculation of the p value (e.g., p = .00004 for χ² = 16.91 with df = 1) for the null hypothesis that the nested models fit equally well.
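The arithmetic of this comparison is easy to script. The following SPSS syntax is a minimal sketch (it is not part of the book's appendices) that reproduces the illustrative values just given; the variable names are arbitrary, and CDF.CHISQ is the SPSS cumulative chi-square distribution function used here to obtain the upper-tail p value.

COMMENT Chi-square difference test for two nested models, using the
        illustrative values from the text .
DATA LIST FREE / chi_parsim df_parsim chi_free df_free.
BEGIN DATA
48.27 20 31.36 19
END DATA.
COMPUTE chi_diff = chi_parsim - chi_free.
COMPUTE df_diff = df_parsim - df_free.
COMPUTE p_diff = 1 - CDF.CHISQ(chi_diff, df_diff).
FORMATS chi_diff (F8.2) df_diff (F8.0) p_diff (F10.6).
LIST.

Run as is, chi_diff = 16.91, df_diff = 1, and p_diff is approximately .00004, matching the values in the text.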


In CFA, we can fit similar but different models to different types of samples or to the same sample over time. We can fit the same models to different samples but free the parameters to be estimated independently in each data set. Or we can fit the same model and also use some or all of the estimates in one sample with the data from an independent sample (see Thompson et al., 2003). Analyses of the last kind are the ultimate in tests of invariance, and the ultimate in parsimony, because at the extreme we are estimating no parameters with new data when we fit only the parameters from a prior study.


APPENDIX A

LibQUAL+™ Data

0 0 1 2 8 7 5 5 3 2 3 3 6 5 5 7

002 2 5 7 5 5 4 5 5 6 3 4 3 6

0 0 3

 2 6 5 5 6 5 3 5 3 3 5 4 7

0 0 4 2 5 5 4 6 4 4 4 4 2 1 5 4

0 0 5

 2 5 5 5 5 5 4 6 2 4 4 5 7

006 2 7 7 7 8 7 8 7 6 6 8 7 5

007

 2 8 8 7 7 6 7 6 7 6 5 6 5

008 2 7 7 7 7 5 3 3 6 5 4 5 8

  9 2 1 3 1 1 1 1 1 1 1 1 1 1

0 1 0

 2 9 9 9 8 5 7 5 2 7 6 7 7

  1 1   2 9 9 9 7 7 9 9 7 4 6 5 9

  1 2  2 4 3 3 5 6 5 4 5 2 5 1 4

013

 2 6 6 5 7 7 7 7 7 6 5 6 7

0 1 4

 2 7 7 2 7 2 3 2 3 6 6 5 9

0 1 5 2 9 9 9 9 7 5 4 5 4 1 1 4

016

 2 9 9 9 9 5 5 5 7 6 7 6 6

017

 28899766868

 9 9

0 1 8

 2 8 5 8 5 4 8

 5

 8 4 3 4 7

019 2 9 9 9 9 6 7 6 6 7 9 9 9

0 2 0 2 8 7 7 9 5 6 6 4 5 6 2 9

021 2 9 9 8 9 9 9 9 8 8 6 6 7

022

 2 7 7 7 8 7 6 7 6 7 8 8 7

023

 2 7 8 7 6 5 6 5 6 6 5 7 7

0 2 4 2 8 9 7 8 5 3 4 5 9 9 7 7

025 2 8 8 8 8 6 6 6 7 9 6 9 9

026

 2 8 9 9 9 6 6 6 6 8 8 9 8

027

  2 8 8 8 8 8 7 8 8 8 9 8 7

028

 2 9 9 9 7 6 7 7 6 9 9 6 9

0 2 9

 2 6 5 6 5 5 6 6 5 4 5 4 5

0 30 2 8 8 7 8 7 8 8 7 8 7 6 7

031 2 7 7 7 8 8 7 8 7 7 6 8 8

  3 2 2 1 1 2 1 1 1 1 1 6 6 5 9

0 33 2 8 8 8 8 8 7 7 8 6 1 6 7

0 34

 2 8 8 6 7 5 6 5 6 8 9 8 7

035

 2 8 8 8 5 7 4 8 5 6 6 6 7

0 36 2 8 8 8 8 5 5 6 5 7 7 7 8

0 3 7 2 6 4 5 6 7 7 7 8 5 5 4 4


0 3 8

 2 5 4 5 3 3 4 3 4 6 7 3 3

0 3 9

 2 5 7 6 7 7 5 6 7 6 6 5 5

0 40  7 7 7 8 6 7 7 7 7 6 8 8

0 41 2 6 8 7 7 8 7 7 7 7 9 7 7

0 42

 2 7 8 8 5 6 6 6 6 6 5 5 6

0 43 2 8 8 7 7 5 6 5 6 6 7 6 7

0 4 4 2 7 7 5

 7 4 5 4 4 8 7 6 8

045 2 9 9 9 9 8 9 8 8 9 7 7 9

046 2 7 7 7 7 7 7 7 7 7 7 7 1

047 2 7 5 7 6 4 7 5 4 7 9 6 7

0 4 8

 2 1 3 3 1 4 5 4 4 9 6 7 7

0 49 2 7 8 6 7 8 8 8 8 7 6 7 7

0 5 0 2 8 8 7 8 8 7 7 8 7 8 7 7

051

 2988864686767

052

 2 9 9 8 9 8 7 8 7 8 7 9 9

053 2 6 6 5 6 8 4 6 7 7 7 6 7

0 5 4

 2 8 8 8 4 5 4 5 8 7 9 6 7

0 55

 2 8 8 7 9 2 2 2 1 5 2 6 8

0 5 6

 2 8 8 6 8 3 3 5 6 6 5 4 7

057

 2 9 9 9 9 2 2 2 2 8 9 9 9

058 2 8 8 8 8 7 7 8 8 8 8 8 8

059

 2 6 5 6 7 6 7 6 7 5 6 6 7

0 6 0

 2 8 9 7 7 6 5 6 6 7 7 6 7

0 6 1

 2 7 7 7 7 7 7 7 8 7 7 7 7

0 6 2 2 6 7 7 6 4 4 5 7 6 3 7 3

063

 2 8 8 7 8 8 6 7 9 7 7 7 8

064

 2 8 7 8 8 4 4 4 3 8 2 6 4

0 65 2 8 9 8 7 7 7 7 7 6 6 7 7

066 2 8 8 9 8 8 8 8 8 7 9 6 7

0 6 7 2 7 7 7 7 7 7 6 8 7 6 7 8

0 6 8 2 6 3 7 9 4 3 3 3 9 4 5 5

069

 2 9 8 8 9 8 8 7 8 6 7 6 9

0 7 0 2 6 6 6 5 4 4 4 4 5 256

0 71

 2 8 8 9 8 7 7 7 7 8 6 7 8

0 7 2 2 8 8 8 8 7 6 8 7 7 7 7 8

0 73 2 9 9 7 8 4 5 6 7 6 7 6 7

0 7 4 2 8 8 8 8 6

 6

 6 6 8 8 8 8

075

  2 8 8 8 7 8 7 8 7 6 8 8 9

076 2 8 8 7 8 4 5 6 8 8 5 6 7

0 77

 2 6 6 6 6 7 7 5 6 6 5 3 7

0 7 8

 2 7 5 5 4 3 5 5 6 2 2 6 8

079

 2 4 4 5 8 7 8 8 8 6 4 6 9

0 8 0

 2 9 9 8 7 8 1

 5

 99699


0 8 1

 2 8 5 8 7 3 1 6 1 7 7 6 9

0 82 2 9 7 7 9 9 8 7 9 8 8 8 7

0 83 2 8 8 8 8 6 4 7 7 7 9 6 6

0 8 4

 2 7 7 7 7 6 6 6 7 6 7 6 8

  8 5 2 4 2 3 4 1 4 1 1 1 1 1 1

0 86

 2 6 7 5 6 5 6 6 6 5 6 6 7

0 87

 2 6 6 7 7 8 5 6 9 6 7 6 7

0 8 8

 2 8 8 7 8 6 5 7 7 8 5 6 9

089 2 9 9 9 9 9 6 9 9 8 9 9 7

090 2 5 5 5 5 3 3 44.7 5 5 7

091 2 8 8 8 8 7 7 7 7 6 9 7 9

092 2 5 5 4 5 7 7 7 7 7 6 7 5

093

  2 8 8 8 8 6 6 6 6 4 4 6 7

094

 2 8 8 9 8 8 6 8 8 9 5 7 9

0 95

 2 8 7 7 7 8 6 7 6 7 7 7 8

0 96 2 8 8 8 8 7 7 7 6 7 5 7 8

0 97 2 7 8 7 8 7 5 8 5 7 8 7 7

0 9 8 2 4 5 2 1 5 1 2 2 1 2 6 2

0 99

 2 8 8 1 8 3 5 8 3 4 2 1 7

100

 2 9 8 8 9 7 4 9 8 9 8 9 9

1 1  3 7 7 7 7 6 7 6 6 7 6 6 8

1 2  3

 8 8 6 5 4 4 4 7 6 7 5 8

1 3

 3 5 5 5 5 5 7 5 6 4 5 5 5

104 3 9 9 9 9 6 6 6 7 5 6 9 7

1 5

 3 9 9 8 8 5 3 5 6 8 6 6 8

106 3 6 7 5 7 6 5 5 6 6 7 8 4

1 07 3 8 6 9 7 6 8 8 7 5 5 6 7

108 3 8 7 8 8 6 4 6 7 6 5 6 7

1 0 9 3 6 6 6 5 5 5 5 4 6 7 5 8

1 1 0 3 8 7 7 8 6 8 4 6 6 8 7 7

  3 9 8 9 9 9 9 9 9 8 5 8 9

1 1 2

 3 7 7 8 7 6 8 7 5 6 6 7 5

1 1 3  3 7 7 7 6 6 6 6 6 4 6 7 6

1 1 4

 3 7 7 7 8 1 2 2 1 7 7 8 3

1 1 5 3 7 7 8 7 7 7 7 7 7 5 64

1 1 6 3 8 9 8 8 6 5 6 9 4 3 5 7

1 1 7 3 5 4 5 5 7 7 5 5 1 3 5 3

1 1 8

 3 9 9 8 9 7 7 7 9 7 6 8 9

1 1 9 3 8 8 8 8 7 7 7 8 6 7 8 8

1 2

3 7 6 8 8 7 6 7 6 7 8 7 5

1 2 1  3 8 8 8 9 8 8 8 8 4 7 6 7

1 2 2  3 8 7 8 8 5 4 6 5 6 7 6 7

1 2 3  3 5 3 4 5 2 2 2 2 2 2 2 7


1 2 4 3 9 9 9 9 8 8 8 8 6 7 6 9

1 2 5 3 88 7 93 3 3 5 4 5 68

1263 999998896 7 8 8

1 2 7

 3 9 9 8 9 8 8 8 8 8 7 8 9

1 2 8

 3 9 9 7 9 7 7 7 6 8 7 8 7

1 2 9  3 3 3 7 3 7 5 6 3 7 3 6 6

1 30

 3 9 9 8 9 6 7 7 7 7 7 8 8

1 3 1  3 8 8 6 7 3 5 4 5 8 6 8 8

1 3 2

  5 5 5 5 2 3 3 2 6 6 5 5

1 3 3

 3 7 6 8 8 5 5 5 6 5 4 5 8

1 3 4 3 8 8 9 8 6 6 6 7 8 7 8 8

1 3 5 3 8 8 9 8 8 9 8 8 7 3 7 7

1 3 6  3 7 6 8 8 4 4 4 6 8 6 7 8

1 3 7  3 8 8 8 8 2 2 2 2 1

  1 6 4

1 3 8

 3 8 7 8 9 6 7 5 7 6 8 6 6

139 3 7 8 8 7 6 7 6 7 4 8 8 9

140 3 9 9 7 8 7 8 7 8 5 6 8 8

1 4 1

 3 8 7 7 8 6 7 8 7 7 6 6 7

1 4 2

 3 8 7 6 7 6 6 6 6 7 2 5 8

1 4 3

 3 7 7 7 5 3 2 5 3 9 3 7 4

144 3 8 8 8 8 9 8 8 8 8 8 8 7

1 4 5 3 8 7 8 7 3 5 5 5 6 8 5 6

1 46 3 7 7 6 7 6 5 6 7 7 5 7 6

1 4 7

 3 8 8 7 8 5 6 5 6 6 3 5 3

1 48

 3 5 7 5 7 6 6 6 6 5 5 6 7

149 3 9 9 9 9 7 7 7 7 9 9 9 7

1 5 3 8 8 7 8 5 5 5 7 1 5 6 8

1 5 1  3 8 5 8 8 8 8 8 8 7 2 6 6

1 5 2  3 9 9 9 9 5 6 5 9 9 8 8 5

1 5 3  3 9 9 8 9 8 8 8 9 8 9 8 8

1 5 4   8 8 8 8 7 7 7 8 8 8 7 7

1 5 5 3 6 5 5 5 3 2 3 2 1 2 5 3

1 5 6

 3 5 5 3 5 5 5 5 7 5 3 7 7

1 5 7

 3 7 6 7

 7 3 3 4 4 3 6

 7 4

1 5 8 3 7 6 6 7 5 5 7 6 7 4 8 6

1 5 9  3 9 8 9 9 6 6 6 7 7 9 8 9

160

 3 7 7 7 7 6 6 6 2 8 8 6 7

1 6 1  3 8 8 3 7 5 5 5 4 7 5 6 7

162

 3888888897 3 72

1 6 3  3 9 9 9 9 1 1 1 1 6 5 6 9

164

 3 9 9 9 9 6 6 7 6 6 3 8 7

1 6 5 3 7 7 6 7 6 7 6 6 6 7 5 6

1 6 6 3 9 5 9 8 9 9 9 3 9 8 9 7


APPENDIX B

SPSS Syntax to Execute a Best-Fit Factor Rotation

SET PRINTBACK=LISTING .

COMMENT Type in the pattern matrix that is the target as

matrix  A ; Then  type in the pattern matrix to be

rotated  to

  'best-fit'

  position with  the  target  as

matrix  B ;  Type in a matrix  N_A consisting of all

zeroes with the same number or rows as A and B ;

Type in a  square matrix of all  zeroes with the  number

of

  rows and columns equal to the number of orthogonal

factors

  in A and B .

MATRIX

  .

COMMENT 100 Grad Students,

COMPUTE  A =

{  .92134, .18690,

.21945,

.28211,

.28360,

.84662,

.85348,

.82875,

.80378,

.13580,

Varimax Pattern Matrix

.84248,

.75606,

.80837,

.22292,

.17110,

.29969,

.22454,

.30781,

.15795,

.27222,

.18737

.25625

.35335

.19958

.26935

.03794

.22423

.26075

.81976

.23077,

 .80478

.20840, .80207 } .

COMMENT 100 Faculty, Varimax Pattern Matrix

COMPUTE B =

.18675

 ;

.21537

.16198

.24099

.21722

.21639

.19870

.14477

.66769

.79192

.71967

 }

{  .20284

.10828,

.23051,

.16361,

.90807,

.89654,

.90832,

.79337,

.33664,

.18185,

.11897,

COMPUTE

 N_A =

{ -0 ;

. 0 ;

.0

  ;

.0 ;

.0 ;

.89659,

.87403,

.73176,

.87285,

.12646,

.13470,

.17788,

.32388,

.10530,

.22943,

.33942,


.0

  ;

-0

 

.0  ;

  0 ;

  0 ;

.0

  } .

COMPUTE DIAG_M

 =

{ . 0 , . 0 , . 0 ;

 0 , . 0 , . 0 ;

.0,

 .0, .0 } .

COMPUTE N_B=N_A  .

PRINT A /

FORMAT='F8.2 ' /

TITLE=

1

First Pattern Matrix (Target)' /

SPACE=4

 /

RLABELS=Perl, Per2, Per3, Per4, PerS, Per6,

Per7, Per8, Per9,  PerlO, Peril

 /

CLABELS=Fact_I, Fact_II, Fact-Ill

  / .

COMPUTE A_N=A

 .

-LOOP #1=1

 TO

 NROW(A)

 .

+

 LOOP  #J = 1 TO NCOL(A)  .

COMPUTE A_N(#I,#J)=A(#I,#J)  ** 2 .

+

  END

 LOOP

  .

-END LOOP  .

PRINT

 A_N /

FORMAT='F8.4' /

TITLE='First Pattern Matrix (Target) Squared' /

SPACE=4 /

RLABELS=Perl, Per2, Per3, Per4, Per5, Per6,

PerV, Per8, Per9,  PerlO, Peril

 /

CLABELS=Fact_I, Fact_II, Fact_III

 / .

-LOOP #J=1 TO NCOL(A)  .

+

 LOOP #1=1 TO NROW(A) .

COMPUTE N_A(#I)=A_N(#I,#J) + N_A(#I) .

+

  END LOOP  .

-END LOOP

 .

PRINT

 N_A /

FORMAT='F8.3' /

TITLE=

>

Row Sum of Squares for First Pattern Matrix' /

SPACE=4

 /

RLABELS=Perl, Per2, Per3, Per4, PerS, Per6,

PerV,

 PerS, Per9,  PerlO, Peril  / .

LOOP

 #1=1

 TO

 NROW(A)

 .

COMPUTE N_A(#I)  = 1.0 / (N_A(#I)  ** .5) .

END LOOP  .

PRINT

 N_A /

FORMAT='F8.3'

 /

TITLE='Normalization Factor

 for

 Rows'

 /

SPACE=4

 /

RLABELS=Perl, Per2, PerS, Per4, PerS, Per6,


Per7, Per8, Per9,  PerlO,  Peril  / .

-LOOP  #J=1 TO NCOL(A) .

+  LOOP #1=1

  TO

 NROW(A)

  .

COMPUTE A_N(#I,#J)=A(#I,#J)

  *

 N_A(#I)

  .

+  END LOOP .

-END LOOP

 .

PRINT A_N /

FORMAT='F8.4' /

TITLE='First Pattern Matrix (Target) Normalized' /

SPACE=4 /

RLABELS=Perl, Per2,  Per3,  Per4, Per5, Per6,

PerV,  Per8, Per9,

  PerlO,

  Peril

 /

CLABELS=Fact_I, Fact_II,

  Fact__III

  / .

PRINT B /

FORMAT='F8.2'

 /

TITLE='Second Pattern Matrix'

 /

SPACE=4

 /

RLABELS=Perl, Per2, Per3, Per4, PerS, Per6,

PerV,

  PerS, Per9,

  PerlO,

  Peril /

CLABELS=Fact_I, Fact_II, Fact_III

  / .

COMPUTE B_N=B .

-LOOP #1=1 TO NROW(B) .

+  LOOP #J=1

 TO

  NCOL(B)

  .

COMPUTE B_N(#I,#J)=B(#I,#J)

  ** 2 .

+ END LOOP .

-END LOOP  .

PRINT

 B_N /

FORMAT='F8.4' /

TITLE='Second

  Pattern Matrix Squared' /

SPACE=4 /

RLABELS=Perl, Per2, Per3, Per4,

  Per5,

  Per6,

PerV,  PerS, Per9,  PerlO,  Peril /

CLABELS=Fact_I, Fact_II, Fact_III  / .

-LOOP  #J=1 TO NCOL(B) .

+

  LOOP #1=1 TO NROW(B) .

COMPUTE N_B(#I)=B_N(#I,#J)

  +

 N_B(#I)

  .

+  END  LOOP  .

-END LOOP

  .

PRINT N_B /

FORMAT='F8.3'

 /

TITLE='Row  Sum of Squares for  Second Pattern Matrix' /

SPACE=4 /

RLABELS=Perl,

  Per2,

  Per3,

  Per4, Per5, Per6,

PerV,  PerS, Per9,  PerlO,  Peril  / .

LOOP #1=1 TO NROW(B) .

COMPUTE N_B(#I)

  = 1.0 /

  (N_B(#I)

  ** .5) .

END

 LOOP .

PRINT

 N_B /

FORMAT='F8.3' /

TITLE='Normalization Factor for Rows' /


SPACE=4

 /

RLABELS=Perl, Per2,  Per3, Per4, Per5, Per6,

Per7, PerS, Per9,  PerlO, Peril / .

-LOOP #J=1 TO NCOL(B) .

+

 LOOP #1=1  TO NROW(B)  .

COMPUTE B_N(#I,#J)=B(#I,#J)

  *

 N_B(#I)

  .

+  END LOOP  .

-END LOOP  .

PRINT B_N /

FORMAT='F8.4'

 /

TITLE='Second Pattern Matrix Normalized'

 /

SPACE=4

 /

RLABELS=Perl, Per2,

  Per3,

 Per4, PerS, Per6,

Per?, PerS, Per9,

  PerlO,

 Peril /

CLABELS=Fact_I, Fact_II, Fact-Ill / .

COMPUTE

 A_T=TRANSPOS

 (A_N)

 .

PRINT

 A_T /

FORMAT=

 'F8.2' /

TITLE='A_N Transpose'

 /

SPACE=4

 /

RLABELS=Fact_I, Fact_II, Fact-Ill /

CLABELS=Perl, Per2, Per3, Per4, PerS, Per6,

Per7, PerS, Per9,  PerlO, Peril

  / .

COMPUTE B_T=TRANSPOS(B_N)  .

PRINT B_T /

FORMAT='F8.2'

 /

TITLE='B_N

 Transpose' /

SPACE=4

 /

RLABELS=Fact_I,

 Fact_II, Fact-Ill /

CLABELS=Perl, Per2, Per3, Per4, PerS, Per6,

PerV,

 PerS, Per9,  PerlO, Peril / .

COMPUTE RI=A_T * B_N .

PRINT

 RI /

FORMAT= > F8.3 ' /

TITLE='A_N Transpose times B_N' /

SPACE=4 /

RLABELS=Fact_I, Fact_II, Fact-Ill /

CLABELS=Fact_I,

 Fact_II, Fact-Ill / .

COMPUTE RI_T=TRANSPOS(RI) .

PRINT

 RI_T /

FORMAT='F8.3' /

TITLE='Transpose of  (A_N Transpose times B_N)' /

SPACE=4 /

RLABELS=Fact_I, Fact_II, Fact-Ill /

CLABELS=Fact_I, Fact_II, Fact-Ill

  / .

COMPUTE QUAD=RI  * RI_T  .

PRINT QUAD

 /

FORMAT='F8.3' /

TITLE='A_N Trans * B_N * Trans of (A_N Trans * B_N)' /

SPACE=4 /


TITLE=

>

D= trans (trans A times B) times Eigenvectors' /

SPACE=4 /

RLABELS=Fact_I,

 Fact_II, Fact_III /

CLABELS=Fact_I, Fact_II, Fact_III / .

-LOOP  J=l TO NCOL(A)  .

COMPUTE EE=EIG(J) .

+ LOOP

  1=1 TO

 NCOL(A)

  .

COMPUTE D(I,J)=D(I,J) * EE .

+  END LOOP .

-END LOOP .

PRINT D /

FORMAT=

>

F9.3' /

TITLE='D

 = D

 times Eigenvalues

 **

 -1.5'

 /

SPACE=4 /

RLABELS=Fact_I,

 Fact_II, Fact_III /

CLABELS=Fact_I,

 Fact_II, Fact_III  / .

COMPUTE D_T=TRANSPOS(D)  .

PRINT D_T /

FORMAT='F9.3' /

TITLE='D transposed' /

SPACE=4

 /

RLABELS=Fact_I,

 Fact_II, Fact_III

 /

CLABELS=Fact_I,

 Fact_II, Fact_III / .

COMPUTE C=EIGVEC

 * D_T .

PRINT

 C I

FORMAT='F9.3'  /

TITLE='Factor Correlations (Cosines)'

 /

SPACE=4

 /

RLABELS=Fact_Ia, Fact_IIa, Fact_IIIa /

CLABELS=Fact_Ib,

 Fact_IIb, Fact_IIIb

 / .

COMPUTE  C=D * VEC_T  .

COMPUTE B_ROT=B * C .

PRINT

 B-ROT /

FORMAT='F8.3'

 /

TITLE='B rotated to  Best-Fit with A' /

SPACE=4

 /

RLABELS=Perl,

 Per2, Per3, Per4, Per5, Per6,

PerV, Per8, Per9, PerlO, Peril /

CLABELS=Fact_I, Fact_II, Fact_III / .

COMPUTE BROT_N=B_ROT

 .

-LOOP

 #1=1 TO NROW(A) .

+ LOOP #J=1 TO NCOL(A)  .

COMPUTE BROT_N(#I,#J)=B_ROT(#I,#J)

  ** 2 .

+

 END

 LOOP

 .

COMPUTE N_A(#I)=  .0 .

-END LOOP .

PRINT BROT_N /

FORMAT='F8.4' /

TITLE=

>

Best

 Fit Pattern

 Matrix

  (Target)

 Squared'

 /

SPACE=4 /


RLABELS=Perl, Per2, Per3, Per4, Per5, Per6,

PerV, Per8, Per9, PerlO, Peril

 /

CLABELS=Fact_I, Fact_II,

  Fact__III

  / .

-LOOP

  #J=1 TO NCOL(A) .

+

 LOOP

  #1=1 TO

 NROW(A)

 .

COMPUTE N_A(#I)=BROT_N(#I,#J)  + N_A(#I) .

+

 END

 LOOP

 .

-END LOOP .

PRINT N_A /

FORMAT='F8.3'  /

TITLE=

1

Row

  Sum of Squares  for

 Best

  Fit

 Matrix'

 /

SPACE=4 /

RLABELS=Perl, Per2, Per3, Per4, Per5, Per6,

PerV,

  Per8, Per9,  PerlO,  Peril  / .

LOOP #1=1  TO  NROW(A)  .

COMPUTE N_A(#I)

  = 1.0 /

  (N_A(#I)

  ** .5 } .

END

 LOOP

 .

PRINT N_A /

FORMAT='F8.3'

 /

TITLE='Normalization Factor

 for

 Rows'

 /

SPACE=4

 /

RLABELS=Perl, Per2, Per3, Per4, Per5, Per6,

PerV,  Per8, Per9,

  PerlO,

  Peril

  / .

-LOOP

  #J=1

 TO

 NCOL(A)

 .

+

  LOOP #1=1 TO NROW(A) .

COMPUTE BROT_N(#I,#J)=B_ROT(#I,#J)

  *

 N_A(#I)

  .

+ END

 LOOP

 .

-END LOOP  .

PRINT BROT_N /

FORMAT='F8.4' /

TITLE=

>

Best Fit  Pattern Matrix (Target) Normalized' /

SPACE=4 /

RLABELS=Perl, Per2, Per3, Per4, Per5, Per6,

PerV,  Per8, Per9,

  PerlO,

  Peril

 /

CLABELS=Fact_I, Fact_II, Fact_III

 / .

COMPUTE BROTN_T=TRANSPOS(BROT_N)

  .

COMPUTE T_M=A_N * BROTN_T .

COMPUTE TEST=DIAG(TJVI) .

PRINT

 TEST /

FORMAT='F8.3'

 /

TITLE='Test Vector Cosines for Variables' /

SPACE=4 /

RLABELS=Perl, Per2, Per3, Per4, Per5, Per6,

PerV,  Per8 , Per9, PerlO, Peril / .

END

  MATRIX

  .


NOTATION

C (V × V)  The symmetric variance/covariance (or simply covariance) matrix with variances on the diagonal and covariances of measured variables with each other off the diagonal.

D²(i)  The Mahalanobis distance of the ith person's set of scores from the set of means on the measured variables.
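Written as a formula (a standard definition supplied here for convenience, not quoted from the text), the distance for person i uses the covariance matrix defined above:

$$D_i^2 = (\mathbf{x}_i - \bar{\mathbf{x}})'\, C_{V \times V}^{-1}\, (\mathbf{x}_i - \bar{\mathbf{x}}),$$

where x_i is person i's vector of scores on the v measured variables and x-bar is the vector of variable means.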

F (N × F)  The scores on the latent factors.

I (F × F)  The identity matrix which, when postmultiplied times an original conformable matrix, yields as an answer the original matrix (like the number 1 in algebra).

P (V × F)  The first-order factor pattern coefficient matrix.

P (F × H)  The second-order factor pattern coefficient matrix.

P (V × H)  The product (first-order by second-order) factor pattern coefficient matrix.

R (F × F)  The symmetric matrix of bivariate correlations (e.g., Pearson r's) among the f factors.

R (V × V)  The symmetric matrix of bivariate correlations (e.g., Pearson r's) among the v measured variables.

R (V × V), reproduced  The reproduced correlation matrix (i.e., the portion of the intervariable correlation matrix that the factors [i.e., the pattern coefficients] can reproduce).

R (V × V), residual  The residual correlation matrix (i.e., the portion of the intervariable correlation matrix that the factors [i.e., the pattern coefficients] cannot reproduce).

R (V × V)⁻¹  The inverted intervariable correlation matrix, which satisfies the equation: R(V × V)⁻¹ R(V × V) = I(V × V).

S (V × F)  The first-order factor structure coefficient matrix.

W (V × F)  The factor score coefficient matrix.

Z (N × V)  The measured variables in z-score form, such that the columns have means of zero and standard deviations (and variances) of one.
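Two standard relations (summarized here for reference rather than quoted from the book) tie these matrices together: factor scores are obtained by applying the factor score coefficients to the standardized data, and the pattern coefficients together with the factor correlations reproduce part of the intervariable correlation matrix:

$$F_{N \times F} = Z_{N \times V}\, W_{V \times F}$$
$$\hat{R}_{V \times V} = P_{V \times F}\, R_{F \times F}\, P_{V \times F}',$$

with the second reducing to P P' when the factors are orthogonal (i.e., when R (F × F) is an identity matrix).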


GLOSSARY

Communality coefficient (h²). A statistic in a squared metric indicating how much of the variance in a measured variable the factors as a set can reproduce, or conversely, how much of the variance of a given measured variable was useful in delineating the factors as a set.
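For orthogonal (uncorrelated) factors, this statistic can be written as a simple sum (a standard relation added here as a summary, not quoted from the glossary):

$$h_j^2 = \sum_{k=1}^{f} p_{jk}^2,$$

where p_jk is the pattern/structure coefficient of measured variable j on factor k.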

Component. A set of pattern coefficients for the measured variables that can be used to estimate factor scores, derived without iterative estimation of communality coefficients. (Some researchers consider "factor" a synonymous term, but others feel that "factor" is a term that should be reserved for weights derived when iteration is invoked.)

Critical ratio (or t or Wald statistic). A parameter estimate divided by its standard error, which is expected to be greater than roughly |2.0| for parameters that one expects to be nonzero.

Eigenvalues (λ). Area-world, variance-accounted-for statistics that characterize the amount of information present in a given factor or function. (Eigenvalues are also sometimes synonymously called "characteristic roots.")

Factor rotation. Graphic (visual) or mathematical movement of the axes measuring the factor space used in exploratory factor analysis so that the factors can be more readily interpreted.

Factor scores. The latent (synthetic or composite) variables computed on each factor (like the Y scores in multiple regression, or the discriminant function scores in descriptive discriminant analysis) that may then be used in further statistical analyses (e.g., t tests, analyses of variance) in place of the measured variables.

First-order factors. The pattern or pattern/structure coefficients for the factored entities (measured variables in the commonly used R-technique analysis).

General linear model (GLM). The notion that all statistical analyses are correlational, can yield effect sizes analogous to r², and apply weights to measured variables to obtain scores on the composite variables that are actually the focus of all these analyses.

Identification. A model is said to be identified when, for a given research problem and data set, sufficient constraints are imposed such that there is a single set of parameter estimates yielded by the analysis.

Inadmissible. A statistical estimate that is mathematically implausible (e.g., a negative variance, or an r that is outside the range of -1 to +1) or a solution that contains one or more estimates that are implausible.

Invariance. The degree to which a factor or a solution can be reproduced across samples, across different analytic methods, or both.

Inversion. The matrix algebra process of solving to find a matrix that when postmultiplied by the original matrix yields the identity matrix; inverted matrices are used in matrix algebra to accomplish division.

Iteration. A statistical process of estimating a set of results, and then successively tweaking them until two consecutive sets of estimates are sufficiently close that the iteration is deemed to have converged.

Loadings. An ambiguous and confusing slang term used by some authors for pattern coefficients, and by some authors for structure coefficients, and by some authors for both, even within the same article, even when the coefficients are not equal.

Measurement error. The variance in scores that is deemed unreliable; this area-world proportion can be estimated as 1.0 minus a reliability coefficient.

Modification index. The estimated decrease (improvement) in the model fit chi-square resulting from freeing a given previously fixed parameter instead to be estimated.

Oblique. An angle of more or less than 90° between some rotated factors, which means that the factors (or components) and factor scores are correlated.

Orthogonal. The 90° angle of all unrotated and some rotated factors, which statistically means the factors (or components) are perfectly uncorrelated.

Outliers. Persons with aberrant or anomalous scores on one or more variables with regard to one or more statistics.

Pattern coefficients. The weights applied to the measured variables to obtain scores on the factor analysis latent variables (called factor scores). These weights are analogous to the β weights in multiple regression, the standardized discriminant function coefficients in descriptive discriminant analysis, and the standardized canonical function coefficients in canonical correlation analysis.

Pattern/structure coefficients. The term for the entries in the single exploratory factor analysis coefficient matrix produced when factors are first extracted or after rotation when orthogonal rotation has been invoked, and in CFA when factors are constrained to be uncorrelated.

Procrustean rotation. A method that rotates a pattern (or pattern/structure) coefficient matrix to best-fit position with a theoretical or actual pattern coefficient matrix.

Reflection. The process of changing all the signs of pattern and structure coefficients on a given factor so that all or most of the largest values on the factor are positive in sign.

Sampling error. The unique, idiosyncratic variance in a given sample that does not exist (a) in the population, which has no idiosyncratic variance due to a sampling process, or (b) in other samples, which do have sampling error, but sampling error variances that are at least somewhat different.

Second-order factors. The pattern or pattern/structure coefficients for the second-order factors defined by the first-order factors.

Simple structure. A theoretical pattern of the dispersion of near-zero and large nonzero pattern coefficients that optimizes interpretability of the factors by there being (a) enough variables reflecting a construct and (b) most variables reflecting only a single factor, such that the factors can be both labeled and distinguished from each other.

Structure coefficient. The Pearson product-moment correlation of a factored entity (measured variables in the commonly used R-technique analysis) with a latent factor.

REFERENCES

Anderson,

  T. W., &

  R u b i n ,

  H.

  (1956). Statistical inference

  in

  factor analysis.

Proceedings

  of

  the

  Third Berkeley   Symposium

  on

  Mathem atical Statistics

 and

  Proba-

bility,

  5,

  111-150.

Arbuckle ,  J. L, &

 Wothke,

  W.  (1999).

  AMOS

  4.0  user s

  guide.

 Chicago:  Small-

waters.

Armstrong,

  J. S.

  (1967) . Der iva t ion

  of

  theory

  by

  means

  of

  factor analysis

  or

Tom   Swift  and his  electric factor analysis  machine.  American Statistician,

2 1 ( 5 ) ,

  17-21.

Bagozzi,  R. P . ,  Fornell,  C. , &  Larcker, D. F.  (1981). Canonical  correlation analysis

as a

 special case

 of a

 structural relations model. Multivariate Behavioral  Research,

16,

  437-454.

Bandalos,  D. , &  Finney,  J. S.  (2001). I tem parceling issues  in  structural equat ion

modeling.  In G. A. Ma rcoulides & R. Schumacker

  (Eds.) , New

 developments

 and

techniques

  in structural equation  modeling   (pp.

 269-295).

 Mahwa h, NJ: Er lbaum.

Bart let t ,

  M. S.

  (1937).

  The

  stat is t ical conception

 of

  mental factors.  British  Journal

o f   Psychology,   28 ,  97-104.

Bart let t ,

  M. S.

  (1950).

  Tests  of

  significance

  in

  factor analysis.  British  Journal   of

Psychology,  3(2 ) , 77-85.

Bentler, P. M.  (1990). Comparative fit  indexes  in  structural  models.  Psychological

Bulletin, 107,  238-246.

Bentler,  P. M.  (1994).  On the  qual i ty  of test  statistics  in  covariance structure

analysis:

  Caveat

  emptor.  In C . R .  Reynolds (Ed.) ,  Cognitive assessment:

  A

multidisciplinary perspective

  (pp.  237— 260).  N ew  Y ork: Plenum Press.

Bentler,

  P. M.

  (1995).

  E Q S

  structural equations prog ram manual.  Encino,

 C A :

 M ul t i-

var iate Software.

Bentler,  P. M., &  Bonnett, D. G.  (1980).  Significance tests  and  goodness  of fit in

the ana lysis of covariance structures. Psychological

 Bulletin,

  88, 588— 606.

Bentler,  P. M., &. Yuan, K. -H.  (2000).  On  adding a

 mean

 structure to a covariance

structure model. Educational  an d   Psychological   Measurement,   60,  326-339.

Bollen, K. A . (1 987). Outliers a nd improper solutions: A co nfirma tory factor ana lysis

example. Sociological Methods   and   Research,   15,

  375— 384-

Bollen, K. A. (1989). Structural equations

 with

  latent variables. N ew York: W iley.

Borrello, G. M., &

 Thompson,

 B. (1990). An  hierarchical analysis of the  Hendrick-

Hendrick  measure  of  Lee's typology  of  love. Journal

  of   Social

  Behavior

  and

Personality,  3,

  327-342.

Byrne,  B. M.  (1994). Structural

 equation

 modeling with  EQS and

 E QS/Windows:   Basic

concepts,

  applications,

 and

  programming.

  Thousand

  Oaks,

  CA:

  Sage.

Byrne, B. M.

  (1998).

  Structural equation modeling  with

  LISREL, PRE LIS, and

SIMPL IS: Basic concepts, applications, and programm ing. Ma hwa h, N J: Erlb aum .

183

Page 174: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 174/185

Byrne, B . M.  (2001). Structural

 equation

 modeling

 with  AMOS:

 Basic

 concepts,   applica-

tions,

  and

 programming.

 Mahwah ,  N J:  Erlbaum.

Carr ,  S.  (1992) .  A  primer  on the use of Q-technique  factor analysis. Measurement

and

  Evaluation in Counseling and Development, 25,  133-138.

Cattell, R . B .

 (1953).

  A

  quanti ta t ive analysis

 of the

  changes

  in the

  culture pat tern

of

 Great  Brita in ,

  1837-1937,  by

  P-technique. Acta

  Psychologia,   9 ,

  99-121.

Cattel l ,  R . B .  (1966a).  Th e  data box:  Its  ordering  of  total resources  in  terms  of

possible

  relational systems.  In R. B.

  Cattell

  (Ed.) ,  Handbook   of  m ultivariate

experimental

  psychology

  (pp. 67-128).

 Chicago:

  Rand McNal ly .

Cattel l ,

  R . B .

 (1966b).

  The

  scree test

  for the

  number

 of

 factors. M ultivariate Behav-

ioral  Research, 1, 245-276.

Cattel l ,

  R. B.

  (1978). T he   scientific  use   o f  factor   analysis in behavioral and

  l i f e

  sciences.

N ew

  York: Plenum.

Cliff ,

  N .

  (1987).  Analyzing multivariate

 data..

  Sa n

  Diego,

  C A :

  Ha rcourt Brace Jova-

novich.

Cohen,

 J.

 (1968). Mu ltip le regression

  as a

 general da ta -an aly tic system. Psychological

Bulletin,  70, 426-443.

Cohen,  J., &

 Cohen,

  P. (1983).

  Applied multiple regression/correlation analysis

  for the

behavioral

  sciences. Hillsdale, N J:  Er lbaum.

Cohen, J., Cohen, P ., West, S. G., & A iken,  L. S. (2003). Applied   multiple   regression/

correlation  analysis  for the   behavioral   sciences  (3rd ed.). Mahwah,  N J:  Er lbaum.

Cook,  C .,  Heath, F., &. Thompson,  B .  (2001). Users' hierarchical perspectives  on

library

  service quality:

 A

  L i b Q U A L + ™ s t u d y .

 College

  an d  Research Libraries,

62,  147-153.

Cook,  C. , &. Thompson,  B.

  (2000). Higher-order factor analytic perspectives

  on

users'

 perceptions  of lib rary service quality. Library Information   Science

 Research,

22,

 393-404.

Cook,  C., &.

 Thompson,

  B.

  (2001). Psychometric properties

  of

  scores  from

  the

W eb-based L ibQU AL +™ s tudy of perceptions  of library service qua lity. Library

Trends,

  49,

  585-604.

Cooley,  W . W ., &Lohnes, P . R . (1971). M ultivariate  data  analysis. N ew York: W iley.

Courville, T., &.

 Thompson,

  B. (2001). Use of structure coefficients in published

multiple  regression articles:

  (3 is not

  enough.  Educational

  and

  Psychological

Measurement,  61,

  229-248.

Cronkhite,  G., &  Liska, J. R .  (1980).  The  judgment of communicant acceptabil i ty .

In M. R.  Roloff  & G. R. Miller (Eds.) , P ersuasion: N ew

  directions

  in   theory  an d

research  (pp.  101-139).  Beverly Hills,  C A :  Sage.

Grossman, L. L.  (1996).  Cross-validation ana lysis for the ca nonical case. In

B.

 Thompson (Ed.) , Advances in  social

  science methodology

  (Vol. 4, pp.

 95-106).

Greenwich,

  C T: JAI

  Press.

Cudeck , R .

 (1989).

  The analysis of correlation mat rices using covariance s truc ture

models.  Psychological

  Bulletin, 105,

  317-327.

184   R E FE R E NCE S

Page 175: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 175/185

Gumming,

 G., & Finch, S. (2001). A primer on the understanding, use and calcula-

tion of confidence intervals

 that

 are based on central and noncentral  d i s t r ibu-

tions.  Educational

  and

 P sychological

 Measurement, 61,

  532-575.

Davis,

 W. R.

 (1993).

 The FC1

 rule

 of

 identification

 for

 confirmatory

 factor

 analysis:

A general sufficient  condition.

  Sociological   Methods

  and Research, 21,

 403-437.

Diaconis,

 P., &.

 Efron,

 B.

 (1983). Computer-intensive methods

  in

 statistics.  Scientific

American,  248(5),

  116-130.

Duncan,  O. D.  (1975).  Introduction

  to

 structural

  equation

  models.  New York: Aca-

demic Press.

Dunlap,

  W. P., &

  Landis,

  R. S.

  (1998). Interpretations

  of

  multiple regression

borrowed

  from factor

  analysis

  and  canonical correlation.  Journal

  of

 General

Psychology,  125, 397-407.

Fabr igar ,  L. R.,

  Wegener,

  D. T.,

  MacCallum,

  R. C., &  Strahan,  E. J.

  (1999).

Evaluating

  the use of  exploratory

 factor

  analysis  in  psychological research.

Psychological Metho ds,

  4,

  272-299.

Fan,

  X.,

 Thompson,

  B., &

  Wang,

  L.

 (1999).

 The

  effects

  of sample

  size, estimation

methods,

 and

 model specification

 on SEM fit

 indices.

 Structural Equation

 Model-

ing,  6,

  56-83.

Fan,  X .,

 Wang,

  L., &  Thompson,  B . (1997, March).  Effects

  of data

  nonnormality   on

fi t  indices   and

  parameter estimates

  for true and

  misspecified

  SE M

  models.  Paper

presented at the annual meeting of the American Educational Research Associ-

ation, Chicago. (ERIC Document Reproduction Service

 No. ED 408

  299)

Fouladi,

  R. T.

  (2000). Performance

 of  modified

  test statistics

  in

  covariance

  and

correlation structure analysis under conditions of multivariate nonnormality.

Structural  Equation

 Modeling,

  7,

  356-410.

Frankiewicz,

  R. G., &

  Thompson,

  B.  (1979, April). A  comparison of

  strategies

 for

measuring   teacher

  brinkmanship behavior. Paper presented

 at the

  annual meeting

of

  the  American Educational Research Association,  San  Francisco. (ERIC

Document Reproduction Service

  No. ED 171

 753)

Gorsuch,

  R. L.

  (1983). Factor  analysis (2nd ed.). Hillsdale,

 NJ:

 Erlbaum.

Gorsuch,

 R. L.

 (2003). Factor analysis.

 In ]. A.

 Schinka

 & W. F.

 Velicer (Vol. Eds.),

Handbook

  of

  psychology:  Vol.  2.  Research   methods  in

 psychology

  (pp.

 143-164).

Hoboken, NJ: Wiley.

Graham,

  J. M.,

  Guthrie,

  A. C., 6k

 Thompson,

  B.

  (2003). Consequences

  of not

interpreting structure

  coefficients

  in published CFA research: A reminder.

Structural  Equation Modeling,  10,  142-153.

Guil ford, J . P . (1946). N ew standards for test evaluation.

 Educational   and

  Psychological

Measurement,  6,  427-439.

Guil ford, J . P .

  (1967).

  T h e

  nature of human  intelligence.

  N ew

  York: McGraw-Hill.

Guadagnoli, E., & Velicer, W. (1988). Relation of sample

  size

  to the stability of

component patterns.

  Psychological

  Bulletin, 103,

  2 6 5 — 2 7 5.

Guttman,

 L.

 (1954). Some necessary conditions

 for

 common-factor analysis. Psycho-

metrika,  19,  149-161.

R E F E R E N C E S  185

Page 176: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 176/185

Ha yduk ,

  L . A.  (1987).  Structural equation modeling with

  LISREL: Essent ia ls  an d

advances.

  Balt imore:

  Johns Hopkins

  U niv ersi ty Press.

Hendrickson,

  A. E., & White, P. O.  (1964).

 Promax:

  A

  quick

 method for rotation

to oblique simple structure.

  British Journal

  of

  Statistical Psychology, 17,

  65—70.

Henson,

  R . K .

 (1999).

  Mul t i va r i a t e normal i ty :

  What

  is it and how is it assessed?

In B.  Thompson  (Ed. ) ,

  Advances

  in   social   science methodology   (Vol. 5, pp.

193-212).  Stamford,  CT: JAI  Press.

Heywood,

  H. B.  (1931).  On

  fini te

  sequences  of

  real numbers.  Proceedings

  of the

Royal

  Statistical  Society   of London, 134,

  486-501.

Holzinger,  K. J., &. Swineford,  F.

  (1939).

  A  study   in   factor analysis:   The

  stability

  of

a   bi-factor   solution

  (No. 48).

  Chicago:

 U n i ve r s it y

  of

 Chicago  Press.

Horn, J. L. (1965). A  ra t ionale  and test for the  number  of factors  in  factor analysis.

Psychometrika,   30,

  179-185.

Horst ,  P .  (1966).  Psychological

  measurement

  an d

  prediction,

  Belmont,  C A :

W adsworth .

Hu, L, &. Bender, P. M.

 (1999).

 C utoff cr i teria for

 fit

 indexes  in

 covariance

 st ructure

analysis:

  Conventional

  cri ter ia versus

  new

  al terna t ives.

  Structural Equation

Modeling,  6,  1-55.

Huher ty ,  C .  (1994).  Applied

  discrim inant analysis.

  N ew  York: W i ley.

Huber ty , C. J., &  Morris, J. D. (1988). A  single

 contrast test

 procedure.  Educational

and   Psychological   Measurement,  48,  567-578.

Jackson,  D.

 (2001). Sample  size

  and  number  of  parameter

  estimates

  in  ma x i mum

likelihood confirmatory factor analysis: A

 Monte Carlo

 invest iga t ion.

  Structural

Equation Modeling,

  8,

  205-223.

Jones, H. L., Thompson, B., & M iller, A. H. (1980). How

 teachers

 perceive s imi lar i t -

ies and differences among v arious teaching models.

 Journal

 of   Research  in

 Science

Teaching,

  17, 321-326.

Joreskog, K. G.  (1969).  A  general

  approach

  to  confi rmatory maximum l ikel ihood

factor analysis.

  Psychometrika,

 34,  183-202.

Joreskog,

 K. G., & Sorbom, D .

 (1984).

  LISREL   VI  user s

 guide  (3rd ed.). Mooresville,

IN :

  Scientific Software.

Joreskog,

  K. G., &

  Sorbom,

  D .

  (1993).

  LISREL   8:

  User s

  reference  gu ide.

 Chicago:

Scientific Software

  International.

Kaiser, H. F.

 (1958).

 The

  va r i ma x criterion

 for

  analyt ic  rotation

  in

  factor analysis.

Psychometrika,  23,  187-200.

Kerlinger,

 F. N.

  (1979).

  Behavioral  research:

 A

  conceptual

 approach.

  N ew

  York:

 Holt,

Rinehart

  &.

 Winston.

Kerlinger, F. N.  (1984).

  Liberalism

 an d

  conservatism:

 T he

  nature

 an d

 structure

 of   social

attitudes.

  Hillsdale,

  N J:

  Erlbaum.

Kerlinger, F. N.

  (1986). F oundations

 of   behavioral

 research

 (3rd ed.). New York: Holt,

R i neh a r t  & Winston.

186

  REFERENCES

Page 177: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 177/185

Knapp ,  T. R . (1978). Canonical  correlation a nalysis: A general param etric  signifi-

cance  testing system.

  Psychological

  Bulletin, 85,  410-416.

Lautenschlager,  G. F.,  Lance,  C. E., &  Flaherty ,  V. L.  (1989). Parallel  analysis

cri ter ia:  R ev ised equa t ions  fo r  es t ima t ing  the  latent roots  of  random da ta

correla t ion matrices.

 E ducational

  an d

  Psychological

  Measurement, 49,

  339-345.

Lev ine,  M. S.  (1977) .  Canonical   analysis   an d   factor   comparison.  Beverly Hills,

CA:

  Sage.

Lunneborg,

  C . E .

  (2000). Data  analysis

  by

  resampling:

  Concepts   and

  applications.

Pacific  Grove,  CA: D uxbury.

MacCa l lum, R. C.,  Browne, M. W., & Sugawara , H. M.  (1996). Power analysis and

determina t ion

  of

  sample

  size  for

  covariance structural modeling.

  Psychological

Methods, I,  130-149.

MacCa l lum,  R. C.,  W i da m a n,  K. F.,  Zhang,  S., & Hong,  S.  (1999). Sample  size

in factor analysis.  Psychological   Methods, 4,  84-99.

Marcoulides,  G. A., &.

 Schumacker,

  R . E .

  (2001) .

 N ew

  developments

  an d

  techniques

in   structural   equation  modeling.  Mahwah ,  N J:  Er lbaum.

Mered i th , W . (1964)-  Canonical  correlations with  fallible  data .

  Psychometrika,

29,

  55-65.

Montanell i ,  R. G., Jr., &  Humphreys , L. G.  (1976 ). La tent roots  of  random data

correlation matrices with squared multiple correlations  on the  diagonal:  A

Monte  Carlo  study.  Psychometrika, 41,

  341-348.

Mula ik ,

 S. A . (Ed.) . (1992). Theme issue on prin cipa l components analysis. Multwar-

iate

  Behavioral

  Research, 27(3).

Mulaik ,

  S. A. , Ja mes, L. R . , Van A lst ine, J . , Bennett , N. , Lind, S. , &. Sti l lwell ,

C . D. (1989). A n ev alua tion of goodness of fit indic es for stru ctur al equa tion

models.  Psychological   Bulletin,  105, 430-445.

Nasser ,  F., Benson, J. , & W isenb aker, J. (200 2 ). The performance of regression-

based variations of the  v i sua l  scree for determining the number of common

factors.  Educational   and   Psychological

  Measurement,

  62,

  397-419.

Nasser, F., & W isenbaker, J.  (2003) . A M onte C arlo study invest igat ing the  impact

of i tem parcel ing on measures of fit in confirmatory factor analysis.

 E ducational

and   Psychological

  Measurement,  63,  729-757.

Nunna l l y ,

  J . C .

  (1978).

  Psychometric

  theory  (2nd ed.).

 N ew

  York: McG raw-Hi l l.

O'Connor,

 B. P.

  (2000) .

  SPSS

  and SAS

  programs

 for

 determining

  the

  number

 of

components  using  parallel analysis a nd  Velicer's  M A P  test.  Behavior Research

Methods,  Instruments, and Com puters, 32,  396-402.

Ogasawara, H. (2000). Some relationships between  factors and components.  Psycho-

metrika,  65,  167-185.

Pedhazur ,  E. J.  (1982) .  Multiple regression

  in

  behavioral research:

  Explanation and

prediction

  (2nd ed.).

 N ew

  York:  H o l t, R ineha r t

  &.

 W ins ton .

R ussell,  D . W .

  (2002) .

  In

  search

  of

  underlying dimensions:

  The use

 (and abuse)

of

  factor analysis  in  Personality  and  Social  Psychology

  BuUetin.

  Personality   and

Social

  Psychology

  Bulletin,

 28,

  1629-1646.

R E F E R E N C E S

  187

Page 178: Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-American Psychological Association (APA) (2004)

8/19/2019 Bruce Thompson-Exploratory and Confirmatory Factor Analysis_ Understanding Concepts and Applications-America…

http://slidepdf.com/reader/full/bruce-thompson-exploratory-and-confirmatory-factor-analysis-understanding 178/185

Satorra ,  A., &  Bender ,  P. M.  (1994) .

  Corrections

  to  test stat ist ics and  s tandard

errors

  in

  covariance structural analysis.

 In A. von Eye & C. C. Clogg

 (Eds.) ,

Latent

  variable

  analysis:   Applications   fo r

  developmental

  research   (pp.

 399-419).

Thousand

  Oaks,

  CA:

  Sage.

Schmid ,

 J., &

  Le i man ,

 J.

  ( 1 9 5 7 ) .

 The

  development

  of

 hierarchica l factor solutions.

Psychometrika,

  22,

 53-61.

Schumacker , R . E. , & M arcoulides , G. A .

  (1998).

  Interaction

 and

  nonlinear

  e f f e t s

in   s tructural equation modeling.

 M a h w a h ,

  N J:

 E r l b a u m .

Snook,

 S. C., & Gorsuch,  R. L.  (1989).  Component  analysis versus common factor

analysis: A Monte Carlo  study.

  Psychological

  Bulletin, 106,  148-154.

Spearman,

 C.

  (1904) .

  General

  intelligence, ob ject iv ely determined

 and

 mea sured .

American journal   of   Psychology,   15,  201— 293.

Steiger,

  J. H., &.

 Lind ,

  J. C.

  (1980, June). Statistically

 based   tests   for the

  number

 o f

common

  factors.

  Paper presented

  at the

  annual meet ing

  of the

  Psychometr ic

Society,

  Iowa

  C i t y , 1A .

Stephenson,

  W .  (1953) .

  T he   study   of behavior:

  Q-technique

  and its

 m ethodology .

Chicago:

 U n i ve r si t y

 of Chicago

  Press.

Ta t suoka ,

  M. M.

  ( 19 7 1 ) . M ultwariate a nalysis: Techniques  fo r

  educational

  an d

 psycho-

logical  research. New York : W i ley .

Thompson,  B.

  (1980a). Comparison

  of two

  strategies

  for

  collecting

  Q-sort

  da ta .

Psychological

  Reports,

 47,  547-551.

Thompson,

  B. (1980b) . V ali di t y of an ev alu ato r typology.  Ed ucational Ev aluation

and

  Policy

  Analysis ,  2,  59-65.

Thompson,

 B . (1984). C anonical

  correlation

 analysis:

  Uses

  an d  interpretation.

 N e w b u ry

Park ,

 CA:  Sage.

Thompson, B. (1988).

  Program

 FACSTRAP: A

  program

  that

  computes boots t rap

estimates

  of

  factor s t ructure .  Educational

  an d

  Psychological   Measurement,

  48,

681-686.

Thompson,

 B. (1989). M eta-an alysis of

 factor

 stru cture studies: A case study exam ple

wi th Bern's androgyny mea sure. Journal

  of  Experimental

  Education, 57,

  187-197.

Thompson, B. (1990a). MULTINOR: A FORTRAN program

 that

 assists in e va lu a t -

in g

 mul t i v a r i a te normal i ty . Educa tiona l

 and Psychological

 M easurement,

 50,845-

848.

Thompson,  B.  (1990b).

  SECONDOR:

  A  program  that  computes  a  second-order

pr incipal

 components analys is and v ar iou s in terpreta t ion a ids . Educational and

Psychological

 Measurement,

  50,

  575-580.

Thompson,  B.

  (1991).

  A

  pr imer

  on the

  logic

  and use of

  canonica l correlation

analysis. M easurement and Ev aluation in Counseling and D evelopment, 24,  80-95.

Thompson, B.

 ( 19 9 2a ) .

 DISCSTRA: A

  computer program that computes boots t rap

resampling es timates of descr ipt ive discr imina nt ana lys is funct ion and s t ructure

coefficients and

  group centroids.  Educational

 an d   Psychological

  Measurement,

52,

 905-911.


Thompson, B. (1992b). A partial test distribution for cosines among factors across samples. In B. Thompson (Ed.), Advances in social science methodology (Vol. 2, pp. 81-98). Greenwich, CT: JAI Press.

Thompson, B. (1993a). Calculation of standardized, noncentered factor scores: An alternative to conventional factor scores. Perceptual and Motor Skills, 77, 1128-1130.

Thompson, B. (1993b). The use of statistical significance tests in research: Bootstrap and other alternatives. Journal of Experimental Education, 61, 361-377.

Thompson, B. (1994). The pivotal role of replication in psychological research: Empirically evaluating the replicability of sample results. Journal of Personality, 62, 157-176.

Thompson, B. (1995). Exploring the replicability of a study's results: Bootstrap statistics for the multivariate case. Educational and Psychological Measurement, 55, 84-94.

Thompson, B. (1996). AERA editorial policies regarding statistical significance testing: Three suggested reforms. Educational Researcher, 25(2), 26-30.

Thompson, B. (1997). The importance of structure coefficients in structural equation modeling confirmatory factor analysis. Educational and Psychological Measurement, 57, 5-19.

Thompson, B. (2000a). Canonical correlation analysis. In L. Grimm & P. Yarnold (Eds.), Reading and understanding more multivariate statistics (pp. 285-316). Washington, DC: American Psychological Association.

Thompson, B. (2000b). Q-technique factor analysis: One variation on the two-mode factor analysis of variables. In L. Grimm & P. Yarnold (Eds.), Reading and understanding more multivariate statistics (pp. 207-226). Washington, DC: American Psychological Association.

Thompson, B. (2000c). Ten commandments of structural equation modeling. In L. Grimm & P. Yarnold (Eds.), Reading and understanding more multivariate statistics (pp. 261-284). Washington, DC: American Psychological Association.

Thompson, B. (2002). Statistical, practical, and clinical: How many kinds of significance do counselors need to consider? Journal of Counseling & Development, 80, 64-71.

Thompson, B. (Ed.). (2003). Score reliability: Contemporary thinking on reliability issues. Newbury Park, CA: Sage.

Thompson, B., & Borrello, G. M. (1985). The importance of structure coefficients in regression research. Educational and Psychological Measurement, 45, 203-209.

Thompson, B., & Borrello, G. M. (1986). Second-order factor structure of the MBTI: A construct validity assessment. Measurement and Evaluation in Counseling and Development, 18, 148-153.

Thompson, B., & Borrello, G. M. (1987, January). Comparisons of factors extracted from the correlation matrix versus the covariance matrix: An example using the Love Relationships Scale. Paper presented at the annual meeting of the Southwest Educational Research Association, Dallas. (ERIC Document Reproduction Service No. ED 280 862)

Thompson, B., Cook, C., & Heath, F. (2001). How many dimensions does it take to measure users' perceptions of libraries? A LibQUAL+™ study. portal: Libraries and the Academy, 1, 129-138.

Thompson, B., Cook, C., & Heath, F. (2003). Structure of perceptions of service quality in libraries: A LibQUAL+™ study. Structural Equation Modeling, 10, 456-464.

Thompson, B., Cook, C., & Thompson, R. L. (2002). Reliability and structure of LibQUAL+™ scores: Measuring perceived library service quality. portal: Libraries and the Academy, 2, 3-12.

Thompson, B., & Daniel, L. G. (1996). Factor analytic evidence for the construct validity of scores: An historical overview and some guidelines. Educational and Psychological Measurement, 56, 197-208.

Thompson, B., & Pitts, M. C. (1982). The use of factor adequacy coefficients. Journal of Experimental Education, 50, 101-104.

Thurstone, L. L. (1935). The vectors of mind. Chicago: University of Chicago Press.

Thurstone, L. L. (1947). Multiple factor analysis. Chicago: University of Chicago Press.

Tucker, L. R. (1966). Some mathematical notes on three-mode factor analysis. Psychometrika, 31, 279-311.

Vacha-Haase, T. (1998). Reliability generalization: Exploring variance in measurement error affecting score reliability across studies. Educational and Psychological Measurement, 58, 6-20.

Vacha-Haase, T., Kogan, L. R., & Thompson, B. (2000). Sample compositions and variabilities in published studies versus those in test manuals: Validity of score reliability inductions. Educational and Psychological Measurement, 60, 509-522.

Veldman, D. J. (1967). Fortran programming for the behavioral sciences. New York: Holt, Rinehart & Winston.

West, S. G., Finch, J. F., & Curran, P. J. (1995). Structural equation models with nonnormal data. In R. H. Hoyle (Ed.), Structural equation modeling (pp. 56-75). Thousand Oaks, CA: Sage.

Wilcox, R. R. (1998). How many discoveries have been lost by ignoring modern statistical methods? American Psychologist, 53, 300-314.

Wilkinson, L., & Task Force on Statistical Inference, APA Board of Scientific Affairs. (1999). Statistical methods in psychology journals: Guidelines and explanations. American Psychologist, 54, 594-604. [Reprint available from http://www.apa.org/journals/amp/amp548594.html]

Zwick, W. R., & Velicer, W. F. (1986). Comparison of five rules for determining the number of components to retain. Psychological Bulletin, 99, 432-442.


INDEX

ADF estimation theory. See Asymptotically distribution-free estimation theory
Alpha factor extraction, 38
AMOS, 111, 120, 131
Analysis of variance (ANOVA), 12
Analytic purpose, 70
Anderson-Rubin estimation method, 44
Area world statistics, 11, 12
Asymptotically distribution-free (ADF) estimation theory, 127, 138
Bartlett method, 58
Bell curves, 121-122
Beta (β) weights, 16
Binet, Alfred, 3
Bivariate normality, 122
Bootstrap methods, 102-108
  in CFA/SEM, 129-130
  in EFA, 128-129
Canonical factor analysis, 38
CFA. See Confirmatory factor analysis
CFI (comparative fit index), 130
Characteristic values. See Eigenvalues
Communality coefficients, 19-21
  initial estimates, 38, 51
  as reliabilities, 99
  as R² values, 61-64
Comparative fit index (CFI), 130
Composite variables (latent variables, synthetic variables), 9, 10
Confirmatory factor analysis (CFA), 5-7, 30, 133-151
  alternative estimation theories in, 138, 139
  and comparisons of model fits, 142, 144-145
  decision sequence in, 109-132
  EFA vs., 110-114
  and higher-order models, 145-150
  matrix of association coefficients in, 120-121
  maximum likelihood theory in, 138
  model identification in, 118-120, 134-137
  multivariate normality in, 121-123
  outlier screening in, 123-126
  and parameter estimation theory, 126-127
  pattern and structure coefficients in, 138, 140-143
  postanalysis decisions in, 127-131
  preanalysis decisions in, 114-127
Confirmatory rotation, 93-95
Conformability, 13-14, 25
Constructs, 5
Convergence, 37
Correlated error variances, 140, 142
Correlation coefficient (r), 10-11
Correlations
  interpret by squaring, 19
Covariance matrix, 30-31
Covariance structure analysis. See Structural equation modeling (SEM)
Critical ratio (t test), 105, 123
Cross-validation, EFA, 101-103
Data
  intervally scaled, 29, 120
Data box (Cattell), 83-86
Delta values, 44
Diagonal of a matrix, 14
Disconfirmability, 115-118
Division of matrices, 15, 25
EFA. See Exploratory factor analysis
Eigenvalue greater than 1.0 rule (K1 rule), 32
Eigenvalues (characteristic values), 21-22
Elbow (scree plots), 33


Equality constraints, 113, 147
Error(s)
  correlated error variances, 140, 142
  measurement, 50-52, 99
  model specification, 99-100
  root-mean-square error of approximation (RMSEA), 130
  sampling, 50-52, 100
Estimates
  admissible, 135
Estimation theory(-ies)
  asymptotically distribution-free, 127, 138
  maximum likelihood, 126-127
  ordinary least squares (OLS), 138
Exploratory factor analysis (EFA), 5-7, 9
  bootstrap, EFA, 102-108
  CFA vs., 110-114
  cross-validation, EFA, 101-103
  factor extraction method in, 36-38
  factor rotation in, 38-44
  factor scores in, 44-47
  linear sequence of decisions in, 27-28, 47-48
  matrix of association coefficients in, 28-31
  number of factors in, 31-36
  orthogonal vs. oblique solutions in, 71
  sample size in, 24-25
External replicability analyses, 100
Factor adequacy, 94
Factor analysis
  development of, 3
  and nature of constructs, 5
  and parsimony, 5
  purposes of, 4-5
  two major classes of, 5-7
  and validity, 4-5
Factor extraction
  alpha, 38
  image, 38
  maximum likelihood, 38
Factor interpretation
  naming of factors, 97-98
  salience, 97
Factor rotation, 38-44. See also Oblique rotations
Factors
  naming of, 97-98
  number of (in EFA), 31-36
  possible number of, 17-18
  reflection of, 96-97
Factor scores, 16, 44-47
  Anderson-Rubin, 44
  Bartlett, 58
  in different extraction methods, 57-61
First-order factors, 72
Fit statistics, 127-130
General linear model (GLM), 7, 109
  interpretations in, 71-72
  weights in, 15-16
g theory of intelligence, 3
Guilford, Joy, 4-5
Heywood cases, 20
Higher-order factors
  CFA, 145-150
  EFA, 72-81
Identification, model, 118-120, 134-137
Identity matrix, 14-15
Ill-conditioned matrices, 15
Image factor extraction, 38
Inadmissibility, 20
Independence model, 118
Intelligence, 3
Internal replicability methods, 99-108
  bootstrap, EFA, 102-108
  cross-validation, EFA, 101-103
  external vs., 100
  and sampling error, 99-100
Invariance, testing for model, 153-162
  with same model, different estimates, 153-156
  with same model, equal estimates, 155, 157-159
  with same model, test invariance of parameters, 159-161
Inverse, matrix, 15
IQ tests, 3
Iteration, 37


K1 rule (eigenvalue greater than 1.0 rule), 32
Latent variables. See Composite variables
LibQUAL+™ study, 28, 39
LISREL, 111
Loadings, 19
Mahalanobis distance, 124
Matrix algebra, 9, 12-15, 25
Matrix(-ces), 13-15
  of association coefficients, 28-31
  conformable, 13-14
  covariance, 30-31
  division of, 15, 25
  identity, 14-15
  ill conditioned, 15
  inverse of, 15
  multiplication of, 13-14, 25
  symmetric, 14
  transpose of, 13
Maximum likelihood factor extraction, 38
Maximum likelihood theory, 126-127
Mean, 11
Measured (observed) variables, 9
Measurement error
  defined, 99
  sampling error vs., 50-52
Median, 11
Mediated ranking, 88
Model
  proof of, 115
Model fit statistics, 127-130
Model specification error, 99-100
Model specification searches, 131
Models
  nested rival, 129
  plausible, 115
Modes, 84
Modification indices, 131
Monte Carlo simulations, 24, 128
Multiplication of matrices, 13-14, 25
Multivariate Behavioral Research, 53
Multivariate normality, 121-123
Multivocal models, 142
Naming of factors, 97-98
NFI (normed fit index), 129
Normality
  bivariate, 122
  multivariate, 121-123
  univariate, 121-122
Normed fit index (NFI), 129
Oblimin, 44
Oblique rotations, 42-44, 67-81
  and higher-order factors, 72-81
  and maximizing sample fit vs. replicability, 69-72
Observed variables. See Measured variables
Occam's razor, 70-71
OLS. See Ordinary least squares
One-factor model, 118
Ordinary least squares (OLS), 126, 138
Orthogonal rotation, 42, 67
O-technique factor analysis, 85
Outlier screening, 123-126
Parallel analysis, 34-36
Parameter estimation theory, 126-127
Parcels, item, 123
Parsimony, 5, 70-71
Parsimony-adjusted statistics, 130
Pattern coefficients
  in CFA, 138, 140-143
  in GLM, 16-17
Pattern/structure coefficients, 19
PDA (predictive discriminant analysis), 71
Pearson correlation coefficients, 10, 61
Pearson r
  factors affecting, 30, 124
Pearson product-moment bivariate correlation matrix, 29
Pearson r, 29-31
Pencil test, 33
Postmultiplication, 15
Predictive discriminant analysis (PDA), 71
Principal-axes factor analysis, 37
  iteration process for, 51
  principal-components analysis vs., 53-55
Principal components analysis, 36-38
  principal axes analysis vs., 53-55
Procrustean rotation, 43-44

Promax rotation, 43-44
P-technique factor analysis, 84
Q sorts, 87-88
Q-technique factor analysis, 84, 86-91
  data collection in, 87-88
  uses of, 86
r. See Correlation coefficient
Rank/ranking, 13, 88
Reflection of factors, 96-97
Regression method, 44
Reliability coefficients, 12, 99
Reliability generalization (RG) analysis, 38
Replicability analysis
  external, 100
  internal, 99-108
Reproduced correlation matrix, 16-17, 22-23
Resamples, 103
Residual correlation matrix, 23-24
  inspection of, 33-34
RG analysis, 38
Root-mean-square error of approximation (RMSEA), 130
Rotation, factor. See Factor rotation
R-technique factor analysis, 84
Salience of variables, 97
Sampling error, 100
  measurement error vs., 50-52
Schmid-Leiman solution, 74
Scores, factor. See Factor scores
Score world statistics, 11
Scree test, 32-33
SD (standard deviation), 11
Second-order factors, 72
SEM. See Structural equation modeling
Sequential factor extraction, 22-24
Simple structure, 41
Spearman's rho, 29, 30
Specific factors, 3
SPSS, 22, 36, 44, 45, 51, 58, 61-63, 73, 74-81, 85
Standard deviation (SD), 11
Standard errors. See Error(s)
Standardized regression coefficients, 16
Statistical significance tests, 31-32
S-technique factor analysis, 85
Structural equation modeling (SEM), 109-110
Structural models, 110
Structure coefficients
  in CFA, 138, 140-143
  in EFA, 61
  in GLM, 18-19
Superscripts, 11
Suppressor effect, 72
Symmetric matrix, 14
Synthetic variables. See Composite variables
Third-order factors, 72
Three-mode analysis, 84
Transpose(s), 13
T-technique factor analysis, 85
t test. See Critical ratio
Two-mode analysis, 83-91
  O-technique, 85
  P-technique, 84
  Q-technique, 84, 86-91
  R-technique, 84
  S-technique, 85
  T-technique, 85
Uncorrelated factors model, 118
Uniqueness, 74
Unit weighting, 70
Univariate normality, 121-122
Univocal models, 140, 141
Validity, 4-5
Variables
  composite (latent, synthetic), 9, 10
  measured (observed), 9
  number of, 10, 49-50
  salience of, 97
Variance, 11
Visual scree test, 33
Wald statistic, 105, 123
Weights/weighting, 9, 15-16
  beta (β), 16
  unit, 70


ABOUT THE AUTHOR

Bruce Thompson is a professor and distinguished research scholar at the Department of Educational Psychology at Texas A&M University, adjunct