Multiple Regression Analysis


CORRELATION AND MULTIPLE REGRESSION ANALYSIS

• Correlation analysis is applied to measure the degree of association between two sets of quantitative data.
• The measure of the degree of relationship between two variables is known as the correlation coefficient.
• It ranges from -1 (perfect negative correlation) through 0 (no correlation) to +1 (perfect positive correlation).
• A correlation can be computed for any two sets of data, even when there is no meaningful relationship between them.

Typical questions correlation analysis can address:

• How are sales of Seven Up correlated with sales of Mirinda?
• How is advertising expenditure correlated with other promotional expenditure?
• Are daily ice cream sales correlated with daily maximum temperature?

Correlation does not necessarily mean there is a causal effect. Given any two strings of numbers, there will be some correlation between them; it does not imply that one variable is causing a change in the other, or is dependent upon the other.
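A minimal computational sketch of the idea, assuming Python with NumPy; the figures below are purely hypothetical values for the ice cream example:

import numpy as np

# Hypothetical daily maximum temperatures and ice cream sales
temperature = np.array([28, 30, 32, 35, 37, 40, 41, 43])
sales = np.array([110, 125, 150, 180, 200, 240, 250, 275])

# np.corrcoef returns the 2x2 correlation matrix; the off-diagonal entry
# is the Pearson correlation coefficient r, which lies between -1 and +1.
r = np.corrcoef(temperature, sales)[0, 1]
print(f"r = {r:.3f}")  # close to +1 here, i.e. a strong positive correlation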

[Scatter plot: Positive Correlation (y-axis: wt)]

[Scatter plot: Negative Correlation]

REGRESSION

• In regression analysis, we have a dependent variable and one or more independent variables.
• Regression predicts one variable (the dependent variable) from the other (independent) variables.
• The main objective of regression analysis is to explain the variation in one variable (called the dependent variable) based on the variation in one or more other variables (called the independent variables).

Regression analysis examines associative relationships between a metric dependent variable and one or more independent variables in the following ways:

• Determine whether the independent variables explain a significant variation in the dependent variable: whether a relationship exists.

• Determine how much of the variation in the dependent variable can be explained by the independent variables: the strength of the relationship. Typical applications include explaining variation in sales of a product based on advertising expenses, number of salespeople, number of sales offices, or all of these variables together.

• Determine the structure or form of the relationship: the mathematical equation relating the independent and dependent variables.

• Predict the values of the dependent variable.



If only one independent variable is used to explain the variation in the dependent variable, the model is known as a simple linear regression.
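A minimal sketch of a simple linear regression fit, assuming Python with NumPy and hypothetical data (advertising expenses explaining sales):

import numpy as np

# Hypothetical advertising expenses (x) and sales (y)
advertising = np.array([10.0, 15.0, 20.0, 25.0, 30.0, 35.0])
sales = np.array([42.0, 55.0, 61.0, 74.0, 80.0, 93.0])

# Fit y = a + b*x; np.polyfit with degree 1 returns [b, a]
b, a = np.polyfit(advertising, sales, 1)
print(f"sales = {a:.2f} + {b:.2f} * advertising")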



If multiple independent variables are used to explain the variation in a dependent variable, it is called a multiple regression model.

Y = a + b1X1 + b2X2 + … + bnXn

where Y is the dependent variable; X1, X2, …, Xn are the independent variables expected to be related to Y and expected to explain or predict Y; and b1, b2, …, bn are the partial regression coefficients of the respective independent variables, which will be determined from the input data.

• The interpretation of the partial regression coefficient b1 is that it represents the expected change in Y when X1 is changed by one unit while X2 is held constant or otherwise controlled.
• If X1 increases by one unit, Y is expected to increase (or decrease, if b1 is negative) by b1 units.
• Likewise, b2 represents the expected change in Y for a unit change in X2 when X1 is held constant. Thus, calling b1 and b2 partial regression coefficients is appropriate.
• The combined effects of X1 and X2 on Y are additive: if X1 and X2 are each changed by one unit, the expected change in Y is (b1 + b2).
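A minimal sketch of fitting such a model by ordinary least squares, assuming Python with NumPy and hypothetical data; the estimated b1 and b2 are the partial regression coefficients described above:

import numpy as np

# Hypothetical observations of two independent variables and Y
X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
X2 = np.array([2.0, 1.5, 4.0, 3.5, 5.0, 6.5])
Y  = np.array([5.1, 6.0, 9.8, 10.2, 13.1, 15.9])

# Design matrix with a leading column of ones for the intercept a
X = np.column_stack([np.ones_like(X1), X1, X2])
(a, b1, b2), *_ = np.linalg.lstsq(X, Y, rcond=None)

print(f"Y = {a:.2f} + {b1:.2f}*X1 + {b2:.2f}*X2")
# b1: expected change in Y for a one-unit change in X1, holding X2 constant;
# b2: expected change in Y for a one-unit change in X2, holding X1 constant.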

Statistics in Multiple Regression Analysis

• F test. The F test tests the null hypothesis that none of the independent variables explains the dependent variable, i.e. that all the partial regression coefficients are zero. The significance ('P') value of F should be less than 0.05 (95% confidence level) or 0.1 (90% confidence level).

• R2 and adjusted R2. R2, the coefficient of multiple determination, is the percentage or proportion of the total variance in Y explained by all the independent variables in the regression equation. It indicates how well the independent variables can predict the dependent variable. Adjusted R2 adjusts this figure for the number of independent variables in the model.

• t test. The output also gives the results of a 't' test for the significance of each variable in the model. It tests whether each individual independent variable can predict the dependent variable.
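These statistics can be read directly from standard regression output. A minimal sketch, assuming Python with the statsmodels package and simulated (hypothetical) data:

import numpy as np
import statsmodels.api as sm

# Simulate hypothetical data in which y depends on x1 and x2 plus noise
rng = np.random.default_rng(0)
x1 = rng.uniform(1, 10, 30)
x2 = rng.uniform(1, 10, 30)
y = 2.0 + 0.5 * x1 + 1.2 * x2 + rng.normal(0, 1, 30)

X = sm.add_constant(np.column_stack([x1, x2]))  # adds the intercept column
fit = sm.OLS(y, X).fit()

print(fit.rsquared, fit.rsquared_adj)  # R2 and adjusted R2
print(fit.fvalue, fit.f_pvalue)        # F statistic and its p-value (want p < 0.05)
print(fit.tvalues, fit.pvalues)        # t statistic and p-value for each coefficient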

Example: a regression model establishes the relationship between loyalty (dependent variable) and satisfaction and purchase (independent variables):

Loyalty Y = a + b1X1 + b2X2, where X1 = satisfaction and X2 = purchase.

Multiple Regression (Table 17.3)

Multiple R       0.97210
R2               0.94498
Adjusted R2      0.93276
Standard Error   0.85974

ANALYSIS OF VARIANCE
              df    Sum of Squares    Mean Square
Regression     2         114.26425       57.13213
Residual       9           6.65241        0.73916
F = 77.29364   Significance of F = 0.0000

VARIABLES IN THE EQUATION
Variable        b         SEb       Beta (ß)   T       Significance of T
Satisfaction    0.28865   0.08608   0.31382    3.353   0.0085
Purchase        0.48108   0.05895   0.76363    8.160   0.0000
(Constant)      0.33732   0.56736              0.595   0.5668



Loyalty Y = 0.33732 + 0.28865 X1 (satisfaction) + 0.48108 X2 (purchase)
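A minimal sketch of using this fitted equation for prediction, with hypothetical satisfaction and purchase scores (the coefficients come from the table above):

def predict_loyalty(satisfaction, purchase):
    # Coefficients from the regression output above
    return 0.33732 + 0.28865 * satisfaction + 0.48108 * purchase

# Hypothetical scores on the two predictors
print(predict_loyalty(satisfaction=7.0, purchase=6.0))  # about 5.24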
