Probability & Statistical Inference Lecture 9

MSc in Computing (Data Analytics)

Page 1: Probability & Statistical Inference Lecture 9

Probability & Statistical Inference Lecture 9

MSc in Computing (Data Analytics)

Page 2: Probability & Statistical Inference Lecture 9

Lecture Outline

Simple Linear Regression
Multiple Regression

Page 3: Probability & Statistical Inference Lecture 9

ANOVA vs Simple Linear Regression

Type of Analysis            Explanatory    Response
ANOVA                       Categorical    Continuous
Simple Linear Regression    Continuous     Continuous

Page 4: Probability & Statistical Inference Lecture 9

ANOVA vs Simple Linear Regression

Page 5: Probability & Statistical Inference Lecture 9

Scatter Plot

A scatter plot (or scattergraph) is a type of chart that uses Cartesian coordinates to display the values of two continuous variables for a set of data.

Page 6: Probability & Statistical Inference Lecture 9

Describe Linear Relationship

Correlation – You can quantify the relationship between two variables with correlation statistics. Two variables are correlated if there is a linear relationship between them.

You can classify correlated variables according to the type of correlation:
Positive: One variable tends to increase in value as the other increases in value.
Negative: One variable tends to decrease in value as the other increases in value.
Zero: No linear relationship between the two variables (uncorrelated).

Page 7: Probability & Statistical Inference Lecture 9

Pearson Correlation Coefficient
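The coefficient itself appears only as an image in the transcript; the standard definition of the sample Pearson correlation coefficient is:

r = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^2}\,\sqrt{\sum_{i=1}^{n}(y_i - \bar{y})^2}} = \frac{S_{xy}}{\sqrt{S_{xx}\,S_{yy}}}

r always lies between −1 and +1: the sign gives the direction of the linear relationship and the magnitude its strength.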

Page 8: Probability & Statistical Inference Lecture 9

Caution using correlation

Anscombe's quartet: four sets of data with the same correlation of 0.816 but strikingly different relationships, a reminder that a correlation coefficient should always be checked against a plot.

Page 10: Probability & Statistical Inference Lecture 9

Regression Analysis Introduction

Many problems in engineering and science involve exploring the relationships between two or more variables. Regression analysis is a statistical technique that is very useful for these types of problems.

For example, in a chemical process, suppose that the yield of the product is related to the process-operating temperature. Regression analysis can be used to build a model to predict yield at a given temperature level.

Page 11: Probability & Statistical Inference Lecture 9

Example

Page 12: Probability & Statistical Inference Lecture 9

Scatter Plot

Page 13: Probability & Statistical Inference Lecture 9

Regression Model

Based on the scatter diagram, it is probably reasonable to assume that the random variable Y is related to x by a straight-line relationship. We use the equation of a line to model the relationship. The simple linear regression model is given by:

Y = \beta_0 + \beta_1 x + \varepsilon

where the slope β₁ and intercept β₀ of the line are called the regression coefficients and ε is the random error term.

Page 14: Probability & Statistical Inference Lecture 9

Regression Model

[Diagram: the slope β₁ is the change in E(Y) for a one-unit change in x.]

Page 15: Probability & Statistical Inference Lecture 9

Regression Model

The true regression model is a line of mean values:

E(Y \mid x) = \mu_{Y|x} = \beta_0 + \beta_1 x

where β₁ can be interpreted as the change in the mean of Y for a unit change in x. Also, the variability of Y at a particular value of x is determined by the error variance σ². This implies there is a distribution of Y-values at each x and that the variance of this distribution is the same at each x.

Page 16: Probability & Statistical Inference Lecture 9

Regression Model

Page 17: Probability & Statistical Inference Lecture 9

Simple Linear Regression

• The case of simple linear regression considers a single regressor or predictor x and a dependent or response variable Y.

• At each level of x, Y is a random variable whose expected value is

E(Y \mid x) = \beta_0 + \beta_1 x

• We assume that each observation Y can be described by the model

Y = \beta_0 + \beta_1 x + \varepsilon

Page 18: Probability & Statistical Inference Lecture 9

Simple Linear Regression

Suppose that we have n pairs of observations (x₁, y₁), (x₂, y₂), …, (xₙ, yₙ).

[Figure: deviations of the data from the estimated regression model.]

Page 19: Probability & Statistical Inference Lecture 9

Simple Linear Regression

The method of least squares is used to estimate the parameters β₀ and β₁ by minimizing the sum of the squares of the vertical deviations shown in the figure below.

[Figure: deviations of the data from the estimated regression model.]

Page 20: Probability & Statistical Inference Lecture 9

Least Squares Estimator
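The estimator formulas appear only as images in the transcript; the standard least-squares solution, which minimizes

L = \sum_{i=1}^{n}(y_i - \beta_0 - \beta_1 x_i)^2

is

\hat{\beta}_1 = \frac{S_{xy}}{S_{xx}}, \qquad \hat{\beta}_0 = \bar{y} - \hat{\beta}_1\,\bar{x}

giving the fitted line ŷ = β̂₀ + β̂₁x.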

Page 21: Probability & Statistical Inference Lecture 9

Model Estimates

Page 22: Probability & Statistical Inference Lecture 9

Notation
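The notation formulas are likewise missing from the transcript; the standard definitions used above are:

S_{xx} = \sum_{i=1}^{n}(x_i - \bar{x})^2, \qquad S_{xy} = \sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})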

Page 23: Probability & Statistical Inference Lecture 9

Example

Page 24: Probability & Statistical Inference Lecture 9

Example

Page 25: Probability & Statistical Inference Lecture 9

Example

Scatter plot of oxygen purity y versus hydrocarbon level x and regression model ŷ = 74.20 + 14.97x.
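As a minimal sketch of this computation (an illustration, not from the slides): assuming NumPy arrays x (hydrocarbon level) and y (oxygen purity) hold the lecture's data, the least-squares estimates follow directly from the formulas above.

```python
# Minimal least-squares fit; assumes x (hydrocarbon level) and
# y (oxygen purity) are NumPy arrays holding the lecture's data.
import numpy as np

def fit_simple_lr(x, y):
    x_bar, y_bar = x.mean(), y.mean()
    sxx = np.sum((x - x_bar) ** 2)           # S_xx
    sxy = np.sum((x - x_bar) * (y - y_bar))  # S_xy
    b1 = sxy / sxx                           # slope estimate
    b0 = y_bar - b1 * x_bar                  # intercept estimate
    return b0, b1
```

On the slide's data this comes out to approximately ŷ = 74.20 + 14.97x.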

Page 26: Probability & Statistical Inference Lecture 9

Demo

Page 27: Probability & Statistical Inference Lecture 9

Model Assumptions

Fitting a regression model requires several assumptions:
1. Errors are uncorrelated random variables with mean zero;
2. Errors have constant variance; and,
3. Errors are normally distributed.

The analyst should always consider the validity of these assumptions to be doubtful and conduct analyses to examine the adequacy of the model.

Page 28: Probability & Statistical Inference Lecture 9

Testing Assumptions – Residual Analysis

• The residuals from a regression model are ei = yi - ŷi, where yi is an actual observation and ŷi is the corresponding fitted value from the regression model.

• Analysis of the residuals is frequently helpful in checking the assumption that the errors are approximately normally distributed with constant variance, and in determining whether additional terms in the model would be useful.
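A minimal sketch of these checks (not the lecture's own demo), assuming y and the fitted values y_hat from the earlier fit are available:

```python
# Standard residual diagnostics; assumes NumPy arrays y (observed)
# and y_hat (fitted) from the regression above.
import matplotlib.pyplot as plt
from scipy import stats

def residual_plots(y, y_hat):
    e = y - y_hat                               # residuals e_i = y_i - yhat_i
    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
    stats.probplot(e, dist="norm", plot=ax1)    # normal probability plot
    ax1.set_title("Normal probability plot of residuals")
    ax2.scatter(y_hat, e)                       # look for funnel/bow patterns
    ax2.axhline(0.0, color="grey", linewidth=1)
    ax2.set_xlabel("Predicted value")
    ax2.set_ylabel("Residual")
    plt.show()
```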

Page 29: Probability & Statistical Inference Lecture 9

Residual Analysis

Patterns for residual plots: (a) satisfactory, (b) funnel, (c) double bow, (d) nonlinear. [Adapted from Montgomery, Peck, and Vining (2001).]

Page 30: Probability & Statistical Inference Lecture 9

Example - Residual Analysis

Page 31: Probability & Statistical Inference Lecture 9

Example - Residual Analysis

Normal probability plot of residuals.

Page 32: Probability & Statistical Inference Lecture 9

Example - Residual Analysis

Plot of residuals versus predicted oxygen purity, ŷ.

Page 33: Probability & Statistical Inference Lecture 9

Adequacy of the Regression Model

• The quantity

R^2 = \frac{SS_R}{SS_T} = 1 - \frac{SS_E}{SS_T}

is called the coefficient of determination and is often used to judge the adequacy of a regression model.

• 0 ≤ R² ≤ 1.

• We often refer (loosely) to R² as the amount of variability in the data explained or accounted for by the regression model.

Page 34: Probability & Statistical Inference Lecture 9

Adequacy of the Regression Model

• For the oxygen purity regression model, R² = SSR/SST = 152.13/173.38 = 0.877.

• Thus, the model accounts for 87.7% of the variability in the data.
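Continuing the earlier sketch (illustrative, not from the slides), R² can be computed directly from the observed and fitted values:

```python
# R^2 from observed y and fitted y_hat (NumPy arrays, as before).
import numpy as np

sse = np.sum((y - y_hat) ** 2)       # error sum of squares, SS_E
sst = np.sum((y - y.mean()) ** 2)    # total sum of squares, SS_T
r2 = 1.0 - sse / sst                 # equivalently SS_R / SS_T
```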

Page 35: Probability & Statistical Inference Lecture 9

Multiple Linear Regression

Page 36: Probability & Statistical Inference Lecture 9

Introduction

Many applications of regression analysis involve situations in which there is more than one regressor variable. A regression model that contains more than one regressor variable is called a multiple regression model.

Page 37: Probability & Statistical Inference Lecture 9

Introduction

For example, suppose that the effective life of a cutting tool depends on the cutting speed and the tool angle. A possible multiple regression model could be:

Y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \varepsilon

where:
Y – tool life
x₁ – cutting speed
x₂ – tool angle
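A minimal sketch of fitting such a model (not the lecture's own demo), assuming 1-D arrays speed, angle, and life hold the data; statsmodels is used here because its per-coefficient p-values feed directly into the selection schemes later:

```python
# Fit Y = b0 + b1*x1 + b2*x2 + e by ordinary least squares;
# assumes 1-D arrays speed (x1), angle (x2), and life (Y).
import pandas as pd
import statsmodels.api as sm

X = sm.add_constant(pd.DataFrame({"speed": speed, "angle": angle}))
model = sm.OLS(life, X).fit()
print(model.params)    # estimates of beta_0, beta_1, beta_2
print(model.pvalues)   # per-coefficient p-values
```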

Page 38: Probability & Statistical Inference Lecture 9

Introduction

The regression plane for the model E(Y) = 50 + 10x₁ + 7x₂, and its contour plot.

Page 39: Probability & Statistical Inference Lecture 9

Introduction

Page 40: Probability & Statistical Inference Lecture 9

Demo

Page 41: Probability & Statistical Inference Lecture 9

Regression & Variable Selection

How do we select the best variables for use in a regression model? Perform a search to see which variables are the most effective. There are three search schemes (a sketch of forward selection follows this list):
Forward sequential selection
Backward sequential selection
Stepwise sequential selection
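As a rough sketch of forward selection (an illustration; the lecture's demo tool is not specified), assuming a pandas DataFrame df whose column "y" is the response and whose other columns are candidate inputs:

```python
# Forward sequential selection: add, one at a time, the candidate
# whose p-value is smallest, as long as it is below the entry cutoff.
import statsmodels.api as sm

def forward_select(df, response="y", entry_cutoff=0.05):
    candidates = [c for c in df.columns if c != response]
    selected = []
    while candidates:
        pvals = {}
        for c in candidates:
            X = sm.add_constant(df[selected + [c]])
            pvals[c] = sm.OLS(df[response], X).fit().pvalues[c]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= entry_cutoff:
            break                    # no remaining input clears the cutoff
        selected.append(best)
        candidates.remove(best)
    return selected
```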

Page 42: Probability & Statistical Inference Lecture 9

Sequential Selection – Forward

[Animated sequence over four slides: at each step, each candidate input's p-value is compared with the entry cutoff, and inputs whose p-values fall below the cutoff enter the model one at a time.]

Page 46: Probability & Statistical Inference Lecture 9

Sequential Selection – Backward

[Animated sequence over eight slides: the model starts with all inputs; at each step, the p-value of each input in the model is compared with the stay cutoff, and inputs whose p-values exceed the cutoff are removed one at a time.]

Page 54: Probability & Statistical Inference Lecture 9

Sequential Selection – Stepwise

[Animated sequence over seven slides: inputs enter as in forward selection when their p-value is below the entry cutoff; after each entry, the inputs already in the model are re-checked against the stay cutoff and removed if they no longer qualify.]

Page 61: Probability & Statistical Inference Lecture 9

Demo

Page 62: Probability & Statistical Inference Lecture 9

Multi-Collinearity

Multi-collinearity exists when two or more of the independent variables used in a regression are correlated.

[Venn-style diagram: overlapping variance among the response Y and correlated predictors X1, X2, X3.]
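One simple way to spot this (an addition, not from the slides) is the pairwise correlation matrix of the inputs; assuming a DataFrame df with predictor columns X1, X2, X3:

```python
# Pairwise Pearson correlations between the regression inputs;
# |r| close to 1 for a pair of inputs signals multi-collinearity.
import pandas as pd

print(df[["X1", "X2", "X3"]].corr())
```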

Page 63: Probability & Statistical Inference Lecture 9

Demo