Multiple Linear Regression II & Analysis of Variance I


  • 8/4/2019 Multiple Linear Regression II & Analysis of Variance I

    Lecture 9: Survey Research & Design in Psychology

    James Neill, 2011

    Multiple Linear Regression II & Analysis of Variance I

  • Overview

    1. Multiple Linear Regression II
    2. Analysis of Variance I

  • Multiple Linear Regression II

    - Summary of MLR I
    - Partial correlations
    - Residual analysis
    - Interactions
    - Analysis of change

  • Readings - MLR (as per previous lecture)

    1. Howell (2009). Correlation & regression [Ch 9]
    2. Howell (2009). Multiple regression [Ch 15; not 15.14 Logistic Regression]
    3. Tabachnick & Fidell (2001). Standard & hierarchical regression in SPSS (includes example write-ups) [alternative chapter from eReserve]

  • Summary of MLR I

    - Check assumptions: LOM, N, normality, linearity, homoscedasticity, collinearity, MVOs, residuals
    - Choose type: standard, hierarchical, stepwise, forward, backward
    - Interpret: overall (R²; changes in R² if hierarchical), coefficients (standardised & unstandardised), partial correlations
    - Equation: if useful (e.g., is the study predictive?)

  • Partial correlations (rp)

    rp is the correlation between X and Y after controlling for (partialling out) the influence of a 3rd variable from both X and Y.
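The "partialling out" step can be illustrated outside the SPSS/PASW used in this unit; a minimal Python sketch of the residual method, on simulated data, is:

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation between x and y after partialling the linear
    influence of z out of both x and y (residual method)."""
    rx = x - np.polyval(np.polyfit(z, x, 1), z)  # x, with z's influence removed
    ry = y - np.polyval(np.polyfit(z, y, 1), z)  # y, with z's influence removed
    return np.corrcoef(rx, ry)[0, 1]

# Simulated example: x and y are related only via the 3rd variable z
rng = np.random.default_rng(42)
z = rng.normal(size=1000)
x = z + rng.normal(size=1000)
y = z + rng.normal(size=1000)
print(np.corrcoef(x, y)[0, 1])   # sizeable 0-order correlation
print(partial_corr(x, y, z))     # near 0 once z is partialled out
```

Correlating the residuals this way gives the same value as the textbook partial-correlation formula.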

  • Partial correlations (rp): Examples

    - Does years of marriage (IV1) predict marital satisfaction (DV) after number of children (IV2) is controlled for?
    - Does time management (IV1) predict university student satisfaction (DV) after general life satisfaction (IV2) is controlled for?

  • Partial correlations (rp) in MLR

    - When interpreting MLR coefficients, compare the 0-order and partial correlations for each IV in each model (draw a Venn diagram).
    - Partial correlations will be equal to or smaller than the 0-order correlations.
    - If a partial correlation is the same as the 0-order correlation, then the IV operates independently on the DV.

  • Partial correlations (rp) in MLR

    - To the extent that a partial correlation is smaller than the 0-order correlation, the IV's explanation of the DV is shared with other IVs.
    - An IV may have a sig. 0-order correlation with the DV, but a non-sig. partial correlation. This would indicate that there is non-sig. unique variance explained by the IV.

  • Semi-partial correlations (sr²) in MLR

    - The sr² values indicate the % of variance in the DV which is uniquely explained by each IV.
    - In PASW, the sr values are labelled "part"; you need to square these to get sr².
    - For more info, see Allen and Bennett (2008), p. 182.
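Equivalently, sr² for an IV is the drop in R² when that IV is removed from the model. A Python sketch with simulated data (the variable names and effect sizes are invented for illustration):

```python
import numpy as np

def r_squared(y, predictors):
    """R^2 from an OLS fit of y on the given predictor columns (plus intercept)."""
    X = np.column_stack([np.ones(len(y))] + predictors)
    y_hat = X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(0)
iv1 = rng.normal(size=5000)
iv2 = rng.normal(size=5000)
dv = iv1 + iv2 + rng.normal(size=5000)

# sr^2 for iv1 = R^2 of the full model minus R^2 of the model without iv1
sr2_iv1 = r_squared(dv, [iv1, iv2]) - r_squared(dv, [iv2])
print(round(sr2_iv1, 2))   # proportion of DV variance unique to iv1
```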

  • Part & partial correlations in SPSS

    In the Linear Regression - Statistics dialog box, check "Part and partial correlations".

  • Multiple linear regression - Example

    Coefficients(a)

      Model 1                Zero-order   Partial
      Worry                    -.521       -.460
      Ignore the Problem       -.325       -.178

    a. Dependent Variable: Psychological Distress

  • [Venn diagram: unique and shared variance among X1, X2, and Y; correlation values shown: .18, .32, .46, .52, .34]


    Residual analysis

  • Residual analysis

    Three key assumptions can be tested using plots of residuals:
    1. Linearity: IVs are linearly related to the DV
    2. Normality of residuals
    3. Equal variances (homoscedasticity)

  • Residual analysis

    Assumptions about residuals:
    - Random noise: sometimes positive, sometimes negative but, on average, 0
    - Normally distributed about 0


  • Residual analysis

    The Normal P-P (Probability) Plot of Regression Standardized Residuals can be used to assess the assumption of normally distributed residuals. If the points cluster reasonably tightly along the diagonal line (as they do here), the residuals are normally distributed. Substantial deviations from the diagonal may be cause for concern. (Allen & Bennett, 2008, p. 183)

  • Residual analysis

    The scatterplot of standardised residuals against standardised predicted values can be used to assess the assumptions of normality, linearity and homoscedasticity of residuals. The absence of any clear patterns in the spread of points indicates that these assumptions are met. (Allen & Bennett, 2008, p. 183)
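As a rough Python sketch (simulated, well-behaved data, not from the lecture), residuals can be extracted from an OLS fit and screened for these assumptions; the Shapiro-Wilk test is one way to check their normality:

```python
import numpy as np
from scipy import stats

# Simulated data that meet the assumptions (for illustration only)
rng = np.random.default_rng(1)
x1, x2 = rng.normal(size=(2, 300))
y = 2 + 0.5 * x1 + 0.3 * x2 + rng.normal(size=300)

# Fit the regression and extract residuals
X = np.column_stack([np.ones(300), x1, x2])
b = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ b
z_resid = (resid - resid.mean()) / resid.std(ddof=1)  # standardised residuals

print(round(resid.mean(), 6))          # ~0: residuals average out to zero
print(stats.shapiro(z_resid).pvalue)   # Shapiro-Wilk check of normality
```

In practice one would also plot `z_resid` against the standardised predicted values, as on the slide above.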

  • Why the big fuss about residuals?

    Assumption violation leads to an inflated Type I error rate (i.e., more false positives).

  • Why the big fuss about residuals?

    Standard error formulae (which are used for confidence intervals and sig. tests) work when residuals are well-behaved. If the residuals don't meet assumptions, these formulae tend to underestimate coefficient standard errors, giving overly optimistic p-values and too-narrow CIs.


    Interactions

  • Interactions

    - Additivity refers to the assumption that the IVs act independently, i.e., they do not interact.
    - However, there may also be interaction effects: when the magnitude of the effect of one IV on a DV varies as a function of a second IV.
    - Also known as a moderation effect.

  • Interactions

    Physical exercise in natural environments may provide multiplicative benefits in reducing stress, e.g.:
    - Natural environment -> Stress
    - Physical exercise -> Stress
    - Natural env. x Phys. ex. -> Stress

  • Interactions

    Y = b1x1 + b2x2 + b12x12 + a + e

    - x12, the cross-product term, is the product of the first two predictors (x1 x x2).
    - b12 can be interpreted as the amount of change in the slope of the regression of Y on x1 when x2 changes by one unit.

  • Interactions

    Conduct hierarchical MLR:
    - Step 1: Pseudoephedrine, Caffeine
    - Step 2: Pseudo x Caffeine (cross-product)

    Examine the change in R² to see whether the interaction term explains additional variance above and beyond the direct effects of Pseudo and Caffeine.
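This hierarchical comparison can be sketched in Python with simulated data (the effect sizes below are invented purely for illustration):

```python
import numpy as np

def r_squared(y, predictors):
    """R^2 from an OLS fit of y on the given predictor columns (plus intercept)."""
    X = np.column_stack([np.ones(len(y))] + predictors)
    y_hat = X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(7)
pseudo = rng.normal(size=2000)
caffeine = rng.normal(size=2000)
# Simulated arousal with a built-in synergistic (interaction) effect
arousal = pseudo + caffeine + 0.5 * pseudo * caffeine + rng.normal(size=2000)

step1 = r_squared(arousal, [pseudo, caffeine])                     # direct effects
step2 = r_squared(arousal, [pseudo, caffeine, pseudo * caffeine])  # + cross-product
print(round(step2 - step1, 3))   # change in R^2 due to the interaction term
```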

  • Interactions

    Possible effects of Pseudo and Caffeine on Arousal:
    - None
    - Pseudo only (incr./decr.)
    - Caffeine only (incr./decr.)
    - Pseudo + Caffeine (additive incr./decr.)
    - Pseudo x Caffeine (synergistic incr./decr.)
    - Pseudo x Caffeine (antagonistic incr./decr.)

  • Interactions

    - Cross-product interaction terms may be highly correlated (multicollinear) with the corresponding simple IVs, creating problems with assessing the relative importance of main effects and interaction effects.
    - An alternative approach is to run separate regressions for each level of the interacting variable.

  • Interactions: SPSS example


    Analysis of Change

  • Analysis of change

    Example research question: In group-based mental health interventions, does the quality of social support from group members (IV1) and group leaders (IV2) explain changes in participants' mental health between the beginning and end of the intervention (DV)?

  • Analysis of change

    Hierarchical MLR:
    - DV = mental health after the intervention
    - Step 1: IV1 = mental health before the intervention
    - Step 2: IV2 = support from group members; IV3 = support from group leader

  • Analysis of change

    Results of interest:
    - Change in R²: how much variance in change scores is explained by the predictors
    - Regression coefficients for predictors in Step 2

  • Summary (MLR II)

    - Partial correlation: unique variance explained by IVs; calculate and report sr².
    - Residual analysis: a way to test key assumptions.

  • Summary (MLR II)

    - Interactions: a way to model (rather than ignore) interactions between IVs.
    - Analysis of change: use hierarchical MLR to partial out baseline scores in Step 1 in order to use IVs in Step 2 to predict changes over time.


    Student questions

    ?

  • ANOVA I

    Analysing differences: t-tests
    - One sample
    - Independent
    - Paired

  • Readings - Analysing differences

    Howell (2010):
    - Ch3 The Normal Distribution
    - Ch4 Sampling Distributions and Hypothesis Testing
    - Ch7 Hypothesis Tests Applied to Means

  • Analysing differences

    - Correlations vs. differences
    - Which difference test?
    - Parametric vs. non-parametric

  • Correlational vs. difference statistics

    - Correlation and regression techniques reflect the strength of association.
    - Tests of differences reflect differences in the central tendency of variables between groups and measures.

  • Correlational vs. difference statistics

    - In MLR we see the world as made of covariation. Everywhere we look, we see relationships.
    - In ANOVA we see the world as made of differences. Everywhere we look, we see differences.

  • Correlational vs. difference statistics

    - LR/MLR, e.g.: What is the relationship between gender and height in humans?
    - t-test/ANOVA, e.g.: What is the difference between the heights of human males and females? Are the differences in a sample generalisable to a population?

  • Which difference test? (2 groups)

    How many groups? (i.e., categories of IV)
    - 1 group = one-sample t-test
    - 2 groups: are the groups independent or dependent?
      - Independent groups: parametric DV = independent samples t-test; non-parametric DV = Mann-Whitney U
      - Dependent groups: parametric DV = paired samples t-test; non-parametric DV = Wilcoxon
    - More than 2 groups = ANOVA models
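The decision chart above can be expressed as a small Python helper (an illustrative aid, not part of the lecture materials):

```python
def which_difference_test(n_groups, dependent=False, parametric=True):
    """Walk the decision chart: number of groups, independence of groups,
    and whether the DV supports a parametric test."""
    if n_groups == 1:
        return "one-sample t-test"
    if n_groups == 2:
        if dependent:
            return "paired samples t-test" if parametric else "Wilcoxon"
        return "independent samples t-test" if parametric else "Mann-Whitney U"
    return "ANOVA models"

print(which_difference_test(1))                                    # one-sample t-test
print(which_difference_test(2, dependent=True, parametric=False))  # Wilcoxon
print(which_difference_test(3))                                    # ANOVA models
```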

  • Parametric vs. non-parametric statistics

    - Parametric statistics: inferential tests that assume certain characteristics are true of an underlying population, especially the shape of its distribution.
    - Non-parametric statistics: inferential tests that make few or no assumptions about the population from which observations were drawn (distribution-free tests).

  • Parametric vs. non-parametric statistics

    - There is generally at least one non-parametric equivalent test for each type of parametric test.
    - Non-parametric tests are generally used when assumptions about the underlying population are questionable (e.g., non-normality).

  • Parametric vs. non-parametric statistics

    - Parametric statistics are commonly used for normally distributed interval or ratio dependent variables.
    - Non-parametric statistics can be used to analyse DVs that are non-normal or are nominal or ordinal.
    - Non-parametric statistics are less powerful than parametric tests.

  • So, when do I use a non-parametric test?

    Consider non-parametric tests when (any of the following):
    - Assumptions, like normality, have been violated.
    - Small number of observations (N).
    - DVs have nominal or ordinal levels of measurement.

  • Some commonly used parametric & non-parametric tests

      Purpose                                               Parametric             Non-parametric
      Compares two independent samples                      t-test (independent)   Mann-Whitney U; Wilcoxon rank-sum
      Compares two related samples                          t-test (paired)        Wilcoxon matched-pairs signed-rank
      Compares three or more groups                         1-way ANOVA            Kruskal-Wallis
      Compares groups classified by two different factors   2-way ANOVA            Friedman; χ² test of independence

  • Why a t-test or ANOVA?

    - A t-test or ANOVA is used to determine whether a sample of scores is from the same population as another sample of scores.
    - These are inferential tools for examining differences between group means.
    - Is the difference between two sample means real or due to chance?

  • t-tests

    - One-sample: one group of participants, compared with a fixed, pre-existing value (e.g., population norms)
    - Independent: compares mean scores on the same variable across different populations (groups)
    - Paired: same participants, with repeated measures


  • Use of t in t-tests

    - t reflects the ratio of between-group variance to within-group variance.
    - Is t large enough that it is unlikely that the two samples have come from the same population?
    - Decision: is t larger than the critical value for t? (See t tables; depends on the critical α and N.)
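Rather than a printed t table, the critical value can be looked up in software. A Python sketch with hypothetical numbers (α = .05 two-tailed, N = 20, obtained t = 2.5):

```python
from scipy import stats

alpha, N = .05, 20                 # hypothetical: two-tailed test, sample of N = 20
df = N - 1
t_crit = stats.t.ppf(1 - alpha / 2, df)   # the "t table" lookup
t_obtained = 2.5                          # hypothetical obtained t

print(round(t_crit, 3))                   # 2.093
print(abs(t_obtained) > t_crit)           # True: reject the null hypothesis
```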

  • Ye good ol' normal distribution

    [Figure: normal curve with 68%, 95% and 99.7% of scores falling within ±1, ±2 and ±3 SDs of the mean]
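Those three percentages follow directly from the standard normal CDF, which can be checked in Python:

```python
from scipy import stats

# Proportion of a normal distribution within +/- k SDs of the mean
for k in (1, 2, 3):
    area = stats.norm.cdf(k) - stats.norm.cdf(-k)
    print(k, round(100 * area, 1))   # 68.3, 95.4, 99.7
```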

  • One-tail vs. two-tail tests

    - A two-tailed test rejects the null hypothesis if the obtained t-value is extreme in either direction.
    - A one-tailed test rejects the null hypothesis if the obtained t-value is extreme in one direction (you choose: too high or too low).
    - One-tailed tests are twice as powerful as two-tailed tests, but they are only focused on identifying differences in one direction.

  • One sample t-test

    Compare one group (a sample) with a fixed, pre-existing value (e.g., population norms).

    e.g., Do uni students sleep less than the recommended amount? Given a sample of N = 190 uni students who sleep M = 7.5 hrs/day (SD = 1.5), does this differ significantly from 8 hrs/day (α = .05)?
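Working that example by hand, t = (M − μ) / (SD / √N). A Python check of the arithmetic:

```python
import math
from scipy import stats

# The sleep example: N = 190, M = 7.5, SD = 1.5, test value = 8 hrs/day
M, SD, N, mu = 7.5, 1.5, 190, 8.0
t = (M - mu) / (SD / math.sqrt(N))
p = 2 * stats.t.sf(abs(t), df=N - 1)   # two-tailed p-value

print(round(t, 2))   # -4.59
print(p < .05)       # True: differs significantly from 8 hrs/day
```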


  • Independent groups t-test

    Compares mean scores on the same variable across different populations (groups):
    - Do Americans vs. non-Americans differ in their approval of Barack Obama?
    - Do males & females differ in the amount of sleep they get?

  • Do males and females differ in amount of sleep per night?

  • Do males and females differ in memory recall?

    Group Statistics (immrec: immediate recall - number correct, wave 1)

      Gender of respondent   N     Mean   Std. Deviation   Std. Error Mean
      1 Male                 1189   7.34   2.109            .061
      2 Female               1330   8.24   2.252            .062

    Independent Samples Test

      Levene's Test for Equality of Variances: F = 4.784, Sig. = .029
      t-test for Equality of Means:
      Equal variances assumed:     t = -10.268, df = 2517,     Sig. (2-tailed) = .000, Mean Difference = -.896, Std. Error Difference = .087, 95% CI of the Difference [-1.067, -.725]
      Equal variances not assumed: t = -10.306, df = 2511.570, Sig. (2-tailed) = .000, Mean Difference = -.896, Std. Error Difference = .087, 95% CI of the Difference [-1.066, -.725]

  • Adolescents' Same Sex Relations in Single Sex vs. Co-Ed Schools

    Group Statistics (SSR)

      Type of School   N     Mean     Std. Deviation   Std. Error Mean
      Single Sex       323   4.9995   .7565            .04209
      Co-Educational   168   4.9455   .7158            .05523

    Independent Samples Test

      Levene's Test for Equality of Variances: F = .017, Sig. = .897
      t-test for Equality of Means:
      Equal variances assumed:     t = .764, df = 489,     Sig. (2-tailed) = .445, Mean Difference = .05401, Std. Error Difference = .07067, 95% CI of the Difference [-.0848, .1929]
      Equal variances not assumed: t = .778, df = 355.220, Sig. (2-tailed) = .437, Mean Difference = .05401, Std. Error Difference = .06944, 95% CI of the Difference [-.0826, .1906]

  • Adolescents' Opposite Sex Relations in Single Sex vs. Co-Ed Schools

    Group Statistics (OSR)

      Type of School   N     Mean     Std. Deviation   Std. Error Mean
      Single Sex       327   4.5327   1.0627           .05877
      Co-Educational   172   3.9827   1.1543           .08801

  • Independent samples t-test vs. 1-way ANOVA

    - Comparison between means of 2 independent sample variables = t-test (e.g., what is the difference in Educational Satisfaction between male and female students?)
    - Comparison between means of 3+ independent sample variables = 1-way ANOVA (e.g., what is the difference in Educational Satisfaction between students enrolled in four different faculties?)
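A Python sketch of the two-group case, simulating sleep-style data (group sizes, means and SDs are hypothetical; the data are generated, not real), with Levene's test guiding the equal-variances choice as in SPSS output:

```python
import numpy as np
from scipy import stats

# Simulated data for two independent groups (hypothetical parameters)
rng = np.random.default_rng(3)
males = rng.normal(7.34, 2.11, size=1189)
females = rng.normal(8.24, 2.25, size=1330)

# Levene's test for equality of variances, then the independent samples t-test
levene_p = stats.levene(males, females).pvalue
result = stats.ttest_ind(males, females, equal_var=levene_p > .05)

print(result.pvalue < .05)   # True: the group means differ significantly
```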

  • Paired samples t-test

    - Same participants, with repeated measures
    - Data are sampled within subjects:
      - Pre- vs. post-treatment ratings
      - Different factors, e.g., voters' approval ratings of candidates X and Y

  • Assumptions (paired samples t-test)

    - LOM:
      - IV: two measures from the same participants (within subjects), i.e., a variable measured on two occasions, or two different variables measured on the same occasion
      - DV: continuous (interval or ratio)
    - Normal distribution of difference scores (robust to violation with larger samples)
    - Independence of observations (one participant's score is not dependent on another's score)

  • Does an intervention have an effect?

    There was no significant difference between pretest and posttest scores (t(19) = 1.78, p = .09).

  • Adolescents' Opposite Sex vs. Same Sex Relations

    Paired Samples Statistics

      Pair 1   Mean     N     Std. Deviation   Std. Error Mean
      SSR      4.9787   951   .7560            .02451
      OSR      4.2498   951   1.1086           .03595

    Paired Samples Test (Paired Differences: SSR - OSR)

      Mean = .7289, Std. Deviation = .9645, Std. Error Mean = .03128,
      95% CI of the Difference [.6675, .7903], t = 23.305, df = 950, Sig. (2-tailed) = .000

  • Paired samples t-test vs. 1-way repeated measures ANOVA

    - Comparison between means of 2 within-subject variables = t-test
    - Comparison between means of 3+ within-subject variables = 1-way ANOVA (e.g., what is the difference in Campus, Social, and Education Satisfaction?)
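A paired design can be sketched in Python; the pre/post numbers below are simulated for illustration, not from the lecture:

```python
import numpy as np
from scipy import stats

# Simulated pre/post ratings for the same participants (hypothetical numbers)
rng = np.random.default_rng(5)
pre = rng.normal(50, 10, size=200)
post = pre + rng.normal(5, 8, size=200)   # repeated measure on the same people

result = stats.ttest_rel(post, pre)       # paired samples t-test
print(result.pvalue < .05)                # True: scores changed from pre to post
```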

  • Summary (Analysing Differences)

    - Non-parametric and parametric tests can be used for examining differences between the central tendency of two or more variables.
    - Develop a conceptualisation of when to use each of the parametric tests, from the one-sample t-test through to MANOVA (e.g., a decision chart).

  • Summary (Analysing Differences)

    t-tests:
    - One-sample
    - Independent samples
    - Paired samples

    What will be covered in

  • 8/4/2019 Multiple Linear Regression II & Analysis of Variance I

    79/81

    79

    What will be covered inANOVA II?

    1. 1-way ANOVA2. 1-way repeated measures ANOVA

    3. Factorial ANOVA4. Mixed design ANOVA5. ANCOVA6. MANOVA7. Repeated measures MANOVA

  • References

    Allen, P., & Bennett, K. (2008). SPSS for the health and behavioural sciences. South Melbourne, Victoria, Australia: Thomson.

    Francis, G. (2007). Introduction to SPSS for Windows: v. 15.0 and 14.0 with Notes for Studentware (5th ed.). Sydney: Pearson Education.

    Howell, D. C. (2010). Statistical methods for psychology (7th ed.). Belmont, CA: Wadsworth.

  • Open Office Impress

    This presentation was made using Open Office Impress: free and open source software.
    http://www.openoffice.org/product/impress.html