

    Chapter Nineteen

    Factor Analysis


    Chapter Outline

1) Overview

2) Basic Concept

    3) Factor Analysis Model

    4) Statistics Associated with Factor Analysis


5) Conducting Factor Analysis

i. Problem Formulation

    ii. Construction of the Correlation Matrix

    iii. Method of Factor Analysis

iv. Number of Factors

    v. Rotation of Factors

    vi. Interpretation of Factors

    vii. Factor Scores

    viii. Selection of Surrogate Variables

    ix. Model Fit


    6) Applications of Common Factor Analysis

    7) Internet and Computer Applications

    8) Focus on Burke

    9) Summary

    10) Key Terms and Concepts


    Factor Analysis

Factor analysis is a general name denoting a class of procedures primarily used for data reduction and summarization.

Factor analysis is an interdependence technique in that an entire set of interdependent relationships is examined without making the distinction between dependent and independent variables.

Factor analysis is used in the following circumstances:

To identify underlying dimensions, or factors, that explain the correlations among a set of variables.

To identify a new, smaller set of uncorrelated variables to replace the original set of correlated variables in subsequent multivariate analysis (regression or discriminant analysis).

To identify a smaller set of salient variables from a larger set for use in subsequent multivariate analysis.


    Factor Analysis Model

Mathematically, each variable is expressed as a linear combination of underlying factors. The covariation among the variables is described in terms of a small number of common factors plus a unique factor for each variable. If the variables are standardized, the factor model may be represented as:

Xi = Ai1F1 + Ai2F2 + Ai3F3 + . . . + AimFm + ViUi

where

Xi = ith standardized variable
Aij = standardized multiple regression coefficient of variable i on common factor j
F = common factor
Vi = standardized regression coefficient of variable i on unique factor i
Ui = the unique factor for variable i
m = number of common factors
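To make the model concrete, here is a minimal numpy sketch (not from the text; the loading values are illustrative, chosen to loosely echo the factor matrix that appears later in Table 19.3). It simulates standardized variables from the equation above, assuming uncorrelated common and unique factors:

    import numpy as np

    rng = np.random.default_rng(0)
    n, m, k = 500, 2, 6                        # respondents, common factors, variables

    F = rng.standard_normal((n, m))            # common factor scores F1..Fm
    U = rng.standard_normal((n, k))            # one unique factor Ui per variable
    A = np.array([[ 0.9,  0.1], [-0.2,  0.8],  # illustrative loadings Aij
                  [ 0.9,  0.1], [-0.3,  0.8],
                  [-0.8, -0.3], [-0.1,  0.85]])
    V = np.sqrt(1 - (A**2).sum(axis=1))        # unique weights so Var(Xi) is about 1
    X = F @ A.T + U * V                        # Xi = Ai1*F1 + ... + Aim*Fm + Vi*Ui

Each row of X is one respondent; each column is one standardized variable built exactly as the model specifies.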


The unique factors are uncorrelated with each other and with the common factors. The common factors themselves can be expressed as linear combinations of the observed variables:

Fi = Wi1X1 + Wi2X2 + Wi3X3 + . . . + WikXk

where

Fi = estimate of the ith factor
Wi = weight or factor score coefficient
k = number of variables

It is possible to select weights or factor score coefficients so that the first factor explains the largest portion of the total variance.

Then a second set of weights can be selected, so that the second factor accounts for most of the residual variance, subject to being uncorrelated with the first factor.

This same principle could be applied to selecting additional weights for the additional factors.


    Statistics Associated with Factor Analysis

Bartlett's test of sphericity. Bartlett's test of sphericity is a test statistic used to examine the hypothesis that the variables are uncorrelated in the population. In other words, the population correlation matrix is an identity matrix; each variable correlates perfectly with itself (r = 1) but has no correlation with the other variables (r = 0).

Correlation matrix. A correlation matrix is a lower triangle matrix showing the simple correlations, r, between all possible pairs of variables included in the analysis. The diagonal elements, which are all 1, are usually omitted.


Factor scores. Factor scores are composite scores estimated for each respondent on the derived factors.

Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy. The Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy is an index used to examine the appropriateness of factor analysis. High values (between 0.5 and 1.0) indicate factor analysis is appropriate. Values below 0.5 imply that factor analysis may not be appropriate.

Percentage of variance. The percentage of the total variance attributed to each factor.

Residuals. Residuals are the differences between the observed correlations, as given in the input correlation matrix, and the reproduced correlations, as estimated from the factor matrix.

Scree plot. A scree plot is a plot of the Eigenvalues against the number of factors in order of extraction.
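As an illustration, both Bartlett's test and the KMO index can be computed directly from their standard formulas. The sketch below is a minimal Python version (function names are my own); it assumes a correlation matrix R and the sample size n:

    import numpy as np
    from scipy.stats import chi2

    def bartlett_sphericity(R, n):
        """Bartlett's test: H0 is that the population correlation matrix is identity."""
        p = R.shape[0]
        stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
        df = p * (p - 1) / 2
        return stat, chi2.sf(stat, df)

    def kmo(R):
        """KMO index: simple correlations compared with partial correlations."""
        Rinv = np.linalg.inv(R)
        d = np.sqrt(np.outer(np.diag(Rinv), np.diag(Rinv)))
        P = -Rinv / d                       # partial correlations
        np.fill_diagonal(P, 0)
        R0 = R - np.eye(R.shape[0])         # off-diagonal simple correlations
        return (R0**2).sum() / ((R0**2).sum() + (P**2).sum())

Applied to the correlation matrix of Table 19.2 with n = 30, these should come out near the values reported later alongside Table 19.4 (chi-square approximately 111.3 on 15 df, KMO approximately 0.66).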


Conducting Factor Analysis

Table 19.1

RESPONDENT
NUMBER    V1    V2    V3    V4    V5    V6
 1       7.00  3.00  6.00  4.00  2.00  4.00
 2       1.00  3.00  2.00  4.00  5.00  4.00
 3       6.00  2.00  7.00  4.00  1.00  3.00
 4       4.00  5.00  4.00  6.00  2.00  5.00
 5       1.00  2.00  2.00  3.00  6.00  2.00
 6       6.00  3.00  6.00  4.00  2.00  4.00
 7       5.00  3.00  6.00  3.00  4.00  3.00
 8       6.00  4.00  7.00  4.00  1.00  4.00
 9       3.00  4.00  2.00  3.00  6.00  3.00
10       2.00  6.00  2.00  6.00  7.00  6.00
11       6.00  4.00  7.00  3.00  2.00  3.00
12       2.00  3.00  1.00  4.00  5.00  4.00
13       7.00  2.00  6.00  4.00  1.00  3.00
14       4.00  6.00  4.00  5.00  3.00  6.00
15       1.00  3.00  2.00  2.00  6.00  4.00
16       6.00  4.00  6.00  3.00  3.00  4.00
17       5.00  3.00  6.00  3.00  3.00  4.00
18       7.00  3.00  7.00  4.00  1.00  4.00
19       2.00  4.00  3.00  3.00  6.00  3.00
20       3.00  5.00  3.00  6.00  4.00  6.00
21       1.00  3.00  2.00  3.00  5.00  3.00
22       5.00  4.00  5.00  4.00  2.00  4.00
23       2.00  2.00  1.00  5.00  4.00  4.00
24       4.00  6.00  4.00  6.00  4.00  7.00
25       6.00  5.00  4.00  2.00  1.00  4.00
26       3.00  5.00  4.00  6.00  4.00  7.00
27       4.00  4.00  7.00  2.00  2.00  5.00
28       3.00  7.00  2.00  6.00  4.00  3.00
29       4.00  6.00  3.00  7.00  2.00  7.00
30       2.00  3.00  2.00  4.00  7.00  2.00
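For readers who want to follow the example in Python rather than SPSS, a minimal sketch (numpy assumed) that enters these ratings and reproduces the correlation matrix shown in Table 19.2:

    import numpy as np

    # Ratings of the 30 respondents on V1..V6 (Table 19.1), one row per respondent
    data = np.array([
        [7, 3, 6, 4, 2, 4], [1, 3, 2, 4, 5, 4], [6, 2, 7, 4, 1, 3],
        [4, 5, 4, 6, 2, 5], [1, 2, 2, 3, 6, 2], [6, 3, 6, 4, 2, 4],
        [5, 3, 6, 3, 4, 3], [6, 4, 7, 4, 1, 4], [3, 4, 2, 3, 6, 3],
        [2, 6, 2, 6, 7, 6], [6, 4, 7, 3, 2, 3], [2, 3, 1, 4, 5, 4],
        [7, 2, 6, 4, 1, 3], [4, 6, 4, 5, 3, 6], [1, 3, 2, 2, 6, 4],
        [6, 4, 6, 3, 3, 4], [5, 3, 6, 3, 3, 4], [7, 3, 7, 4, 1, 4],
        [2, 4, 3, 3, 6, 3], [3, 5, 3, 6, 4, 6], [1, 3, 2, 3, 5, 3],
        [5, 4, 5, 4, 2, 4], [2, 2, 1, 5, 4, 4], [4, 6, 4, 6, 4, 7],
        [6, 5, 4, 2, 1, 4], [3, 5, 4, 6, 4, 7], [4, 4, 7, 2, 2, 5],
        [3, 7, 2, 6, 4, 3], [4, 6, 3, 7, 2, 7], [2, 3, 2, 4, 7, 2],
    ], dtype=float)

    R = np.corrcoef(data, rowvar=False)   # 6 x 6 correlation matrix (Table 19.2)
    print(np.round(R, 3))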


Conducting Factor Analysis

Fig 19.1: Steps in conducting factor analysis

Problem Formulation
Construction of the Correlation Matrix
Method of Factor Analysis
Determination of Number of Factors
Rotation of Factors
Interpretation of Factors
Calculation of Factor Scores
Selection of Surrogate Variables
Determination of Model Fit

Conducting Factor Analysis
Formulate the Problem

The objectives of factor analysis should be identified.

The variables to be included in the factor analysis should be specified based on past research, theory, and judgment of the researcher. It is important that the variables be appropriately measured on an interval or ratio scale.

An appropriate sample size should be used. As a rough guideline, there should be at least four or five times as many observations (sample size) as there are variables.

Correlation Matrix

Table 19.2

Variables    V1      V2      V3      V4      V5      V6
V1          1.000
V2         -0.530   1.000
V3          0.873  -0.155   1.000
V4         -0.086   0.572  -0.248   1.000
V5         -0.858   0.020  -0.778  -0.007   1.000
V6          0.004   0.640  -0.018   0.640  -0.136   1.000


Conducting Factor Analysis
Construct the Correlation Matrix

The analytical process is based on a matrix of correlations between the variables.

Bartlett's test of sphericity can be used to test the null hypothesis that the variables are uncorrelated in the population: in other words, the population correlation matrix is an identity matrix. If this hypothesis cannot be rejected, then the appropriateness of factor analysis should be questioned.

Another useful statistic is the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy. Small values of the KMO statistic indicate that the correlations between pairs of variables cannot be explained by other variables and that factor analysis may not be appropriate.


Conducting Factor Analysis
Determine the Method of Factor Analysis

In principal components analysis, the total variance in the data is considered. The diagonal of the correlation matrix consists of unities, and full variance is brought into the factor matrix. Principal components analysis is recommended when the primary concern is to determine the minimum number of factors that will account for maximum variance in the data for use in subsequent multivariate analysis. The factors are called principal components.

In common factor analysis, the factors are estimated based only on the common variance. Communalities are inserted in the diagonal of the correlation matrix. This method is appropriate when the primary concern is to identify the underlying dimensions and the common variance is of interest. This method is also known as principal axis factoring.
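Continuing the sketch, principal components can be extracted with an eigendecomposition of the correlation matrix R computed earlier; the eigenvalues and variance percentages should reproduce the Initial Eigenvalues block of Table 19.3:

    import numpy as np

    eigvals, eigvecs = np.linalg.eigh(R)
    order = np.argsort(eigvals)[::-1]              # largest eigenvalue first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    pct = 100 * eigvals / eigvals.sum()            # % of variance per factor
    loadings = eigvecs * np.sqrt(eigvals)          # unrotated component loadings
    communalities = (loadings[:, :2]**2).sum(axis=1)  # for the two-factor solution
    # Eigenvector signs are arbitrary, so loading columns may be flipped
    # relative to the printed Factor Matrix.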


Results of Principal Components Analysis

Table 19.3

Communalities

Variables   Initial   Extraction
V1          1.000     0.926
V2          1.000     0.723
V3          1.000     0.894
V4          1.000     0.739
V5          1.000     0.878
V6          1.000     0.790

Initial Eigenvalues

Factor   Eigenvalue   % of variance   Cumulat. %
1        2.731        45.520           45.520
2        2.218        36.969           82.488
3        0.442         7.360           89.848
4        0.341         5.688           95.536
5        0.183         3.044           98.580
6        0.085         1.420          100.000


Results of Principal Components Analysis

Table 19.3 cont.

Extraction Sums of Squared Loadings

Factor   Eigenvalue   % of variance   Cumulat. %
1        2.731        45.520           45.520
2        2.218        36.969           82.488

Factor Matrix

Variables   Factor 1   Factor 2
V1           0.928      0.253
V2          -0.301      0.795
V3           0.936      0.131
V4          -0.342      0.789
V5          -0.869     -0.351
V6          -0.177      0.871

Rotation Sums of Squared Loadings

Factor   Eigenvalue   % of variance   Cumulat. %
1        2.688        44.802           44.802
2        2.261        37.687           82.488


Results of Principal Components Analysis

Table 19.3 cont.

Rotated Factor Matrix

Variables   Factor 1   Factor 2
V1           0.962     -0.027
V2          -0.057      0.848
V3           0.934     -0.146
V4          -0.098      0.845
V5          -0.933     -0.084
V6           0.083      0.885

Factor Score Coefficient Matrix

Variables   Factor 1   Factor 2
V1           0.358      0.011
V2          -0.001      0.375
V3           0.345     -0.043
V4          -0.017      0.377
V5          -0.350     -0.059
V6           0.052      0.395


Results of Principal Components Analysis

Table 19.3 cont.

Reproduced Correlation Matrix

Variables     V1       V2       V3       V4       V5       V6
V1           0.926    0.024   -0.029    0.031    0.038   -0.053
V2          -0.078    0.723    0.022   -0.158    0.038   -0.105
V3           0.902   -0.177    0.894   -0.031    0.081    0.033
V4          -0.117    0.730   -0.217    0.739   -0.027   -0.107
V5          -0.895   -0.018   -0.859    0.020    0.878    0.016
V6           0.057    0.746   -0.051    0.748   -0.152    0.790

The lower left triangle contains the reproduced correlation matrix; the diagonal, the communalities; the upper right triangle, the residuals between the observed correlations and the reproduced correlations.


Conducting Factor Analysis
Determine the Number of Factors

A Priori Determination. Sometimes, because of prior knowledge, the researcher knows how many factors to expect and thus can specify the number of factors to be extracted beforehand.

Determination Based on Eigenvalues. In this approach, only factors with Eigenvalues greater than 1.0 are retained. An Eigenvalue represents the amount of variance associated with the factor. Hence, only factors with a variance greater than 1.0 are included. Factors with variance less than 1.0 are no better than a single variable, since, due to standardization, each variable has a variance of 1.0. If the number of variables is less than 20, this approach will result in a conservative number of factors.
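With eigvals from the extraction sketch above, the Eigenvalue criterion is a one-liner:

    n_factors = int((eigvals > 1.0).sum())   # retains 2 factors here (2.731 and 2.218)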


Conducting Factor Analysis
Determine the Number of Factors

Determination Based on Scree Plot. A scree plot is a plot of the Eigenvalues against the number of factors in order of extraction. Experimental evidence indicates that the point at which the scree begins denotes the true number of factors. Generally, the number of factors determined by a scree plot will be one or a few more than that determined by the Eigenvalue criterion.

Determination Based on Percentage of Variance. In this approach the number of factors extracted is determined so that the cumulative percentage of variance extracted by the factors reaches a satisfactory level. It is recommended that the factors extracted should account for at least 60% of the variance.


Scree Plot

Fig 19.2: Scree plot for the example (Eigenvalues on the vertical axis, component number 1 through 6 on the horizontal)
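A plot like Fig 19.2 can be drawn from the eigenvalues already computed (matplotlib assumed):

    import matplotlib.pyplot as plt

    plt.plot(range(1, len(eigvals) + 1), eigvals, marker="o")
    plt.axhline(1.0, linestyle="--")      # Eigenvalue-criterion reference line
    plt.xlabel("Component Number")
    plt.ylabel("Eigenvalue")
    plt.title("Scree Plot")
    plt.show()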


Conducting Factor Analysis
Determine the Number of Factors

Determination Based on Split-Half Reliability. The sample is split in half and factor analysis is performed on each half. Only factors with high correspondence of factor loadings across the two subsamples are retained.

Determination Based on Significance Tests. It is possible to determine the statistical significance of the separate Eigenvalues and retain only those factors that are statistically significant. A drawback is that with large samples (size greater than 200), many factors are likely to be statistically significant, although from a practical viewpoint many of these account for only a small proportion of the total variance.


Conducting Factor Analysis
Rotate Factors

Although the initial or unrotated factor matrix indicates the relationship between the factors and individual variables, it seldom results in factors that can be interpreted, because the factors are correlated with many variables. Therefore, through rotation the factor matrix is transformed into a simpler one that is easier to interpret.

In rotating the factors, we would like each factor to have nonzero, or significant, loadings or coefficients for only some of the variables. Likewise, we would like each variable to have nonzero or significant loadings with only a few factors, if possible with only one.

The rotation is called orthogonal rotation if the axes are maintained at right angles.


Conducting Factor Analysis
Rotate Factors

The most commonly used method for rotation is the varimax procedure. This is an orthogonal method of rotation that minimizes the number of variables with high loadings on a factor, thereby enhancing the interpretability of the factors. Orthogonal rotation results in factors that are uncorrelated.

The rotation is called oblique rotation when the axes are not maintained at right angles, and the factors are correlated. Sometimes, allowing for correlations among factors can simplify the factor pattern matrix. Oblique rotation should be used when factors in the population are likely to be strongly correlated.
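As a sketch, the usual SVD-based varimax iteration fits in a few lines (the function name is my own). Applied to the two unrotated components extracted earlier, it should approximate the Rotated Factor Matrix of Table 19.3 up to column signs and order:

    import numpy as np

    def varimax(L, gamma=1.0, max_iter=100, tol=1e-6):
        """Rotate a loading matrix L (variables x factors) by the varimax criterion."""
        p, k = L.shape
        T = np.eye(k)                     # accumulated orthogonal rotation
        crit = 0.0
        for _ in range(max_iter):
            LR = L @ T
            G = L.T @ (LR**3 - (gamma / p) * LR @ np.diag((LR**2).sum(axis=0)))
            U, s, Vt = np.linalg.svd(G)
            T = U @ Vt                    # nearest orthogonal matrix
            if s.sum() < crit * (1 + tol):
                break
            crit = s.sum()
        return L @ T

    rotated = varimax(loadings[:, :2])    # compare with the Rotated Factor Matrix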

Conducting Factor Analysis
Interpret Factors

A factor can then be interpreted in terms of the variables that load high on it.

Another useful aid in interpretation is to plot the variables, using the factor loadings as coordinates. Variables at the end of an axis are those that have high loadings on only that factor, and hence describe the factor.


Factor Loading Plot

Fig 19.3: Component plot in rotated space (the six variables plotted with their rotated component loadings as coordinates)

Rotated Component Matrix

Variable   Component 1   Component 2
V1            0.962        -0.027
V2           -0.057         0.848
V3            0.934        -0.146
V4           -0.098         0.854
V5           -0.933        -0.084
V6            0.083         0.885
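The plot itself is straightforward to generate (matplotlib assumed; rotated is the rotated loading matrix from the varimax sketch):

    import matplotlib.pyplot as plt

    labels = ["V1", "V2", "V3", "V4", "V5", "V6"]
    plt.scatter(rotated[:, 0], rotated[:, 1])
    for name, (x, y) in zip(labels, rotated[:, :2]):
        plt.annotate(name, (x, y))
    plt.axhline(0); plt.axvline(0)
    plt.xlabel("Component 1"); plt.ylabel("Component 2")
    plt.title("Component Plot in Rotated Space")
    plt.show()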


Conducting Factor Analysis
Calculate Factor Scores

The factor scores for the ith factor may be estimated as follows:

Fi = Wi1X1 + Wi2X2 + Wi3X3 + . . . + WikXk
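In matrix form this is a single multiplication of the standardized data by the factor score coefficient matrix. A sketch using the coefficients from Table 19.3 and the data entered earlier:

    import numpy as np

    # Factor score coefficient matrix W from Table 19.3 (Factor 1, Factor 2)
    W = np.array([[ 0.358,  0.011],
                  [-0.001,  0.375],
                  [ 0.345, -0.043],
                  [-0.017,  0.377],
                  [-0.350, -0.059],
                  [ 0.052,  0.395]])

    Z = (data - data.mean(axis=0)) / data.std(axis=0, ddof=1)  # standardize (sample SD)
    scores = Z @ W      # one row of estimated factor scores per respondent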


Conducting Factor Analysis
Select Surrogate Variables

By examining the factor matrix, one could select for each factor the variable with the highest loading on that factor. That variable could then be used as a surrogate variable for the associated factor.

However, the choice is not as easy if two or more variables have similarly high loadings. In such a case, the choice between these variables should be based on theoretical and measurement considerations.
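A sketch of this selection rule, using the rotated loadings from earlier:

    import numpy as np

    labels = ["V1", "V2", "V3", "V4", "V5", "V6"]
    top = np.abs(rotated).argmax(axis=0)     # highest absolute loading per factor
    print([labels[i] for i in top])          # V1 for Factor 1, V6 for Factor 2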


Conducting Factor Analysis
Determine the Model Fit

The correlations between the variables can be deduced or reproduced from the estimated correlations between the variables and the factors.

The differences between the observed correlations (as given in the input correlation matrix) and the reproduced correlations (as estimated from the factor matrix) can be examined to determine model fit. These differences are called residuals.
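For an orthogonal solution, the reproduced correlation matrix is simply the loading matrix times its transpose. A minimal sketch of the fit check:

    import numpy as np

    L2 = loadings[:, :2]      # the two retained components
    R_hat = L2 @ L2.T         # reproduced correlations; diagonal holds communalities
    residuals = R - R_hat     # observed minus reproduced (cf. Table 19.3)
    print(np.round(residuals, 3))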


Results of Common Factor Analysis

Table 19.4

Communalities

Variables   Initial   Extraction
V1          0.859     0.928
V2          0.480     0.562
V3          0.814     0.836
V4          0.543     0.600
V5          0.763     0.789
V6          0.587     0.723

Bartlett test of sphericity: Approx. Chi-Square = 111.314, df = 15, Significance = 0.00000
Kaiser-Meyer-Olkin measure of sampling adequacy = 0.660

Initial Eigenvalues

Factor   Eigenvalue   % of variance   Cumulat. %
1        2.731        45.520           45.520
2        2.218        36.969           82.488
3        0.442         7.360           89.848
4        0.341         5.688           95.536
5        0.183         3.044           98.580
6        0.085         1.420          100.000


Results of Common Factor Analysis

Table 19.4 cont.

Extraction Sums of Squared Loadings

Factor   Eigenvalue   % of variance   Cumulat. %
1        2.570        42.837           42.837
2        1.868        31.126           73.964

Factor Matrix

Variables   Factor 1   Factor 2
V1           0.949      0.168
V2          -0.206      0.720
V3           0.914      0.038
V4          -0.246      0.734
V5          -0.850     -0.259
V6          -0.101      0.844

Rotation Sums of Squared Loadings

Factor   Eigenvalue   % of variance   Cumulat. %
1        2.541        42.343           42.343
2        1.897        31.621           73.964


Results of Common Factor Analysis

Table 19.4 cont.

Rotated Factor Matrix

Variables   Factor 1   Factor 2
V1           0.963     -0.030
V2          -0.054      0.747
V3           0.902     -0.150
V4          -0.090      0.769
V5          -0.885     -0.079
V6           0.075      0.847

Factor Score Coefficient Matrix

Variables   Factor 1   Factor 2
V1           0.628      0.101
V2          -0.024      0.253
V3           0.217     -0.169
V4          -0.023      0.271
V5          -0.166     -0.059
V6           0.083      0.500


Results of Common Factor Analysis

Table 19.4 cont.

Reproduced Correlation Matrix

Variables     V1       V2       V3       V4       V5       V6
V1           0.928    0.022   -0.000    0.024   -0.008   -0.042
V2          -0.075    0.562    0.006   -0.008    0.031    0.012
V3           0.873   -0.161    0.836   -0.005    0.008    0.042
V4          -0.110    0.580   -0.197    0.600   -0.025   -0.004
V5          -0.850   -0.012   -0.786    0.019    0.789    0.003
V6           0.046    0.629   -0.060    0.645   -0.133    0.723

The lower left triangle contains the reproduced correlation matrix; the diagonal, the communalities; the upper right triangle, the residuals between the observed correlations and the reproduced correlations.


SPSS Windows

To select this procedure using SPSS for Windows, click:

Analyze>Data Reduction>Factor