Hello, I need to see the steps so that I can understand my mistakes, please!

Use the "AgeSaving" dataset from the previous problem. Fit the following six models:

1. E(Y) = β0 + β1X1 (where Y = savings, X1 = age)
2. E(Y) = β0 + β2X2 (where Y = savings, X2 = income)
3. E(Y) = β0 + β1X1 + β2X2 (where Y = savings, X1 = age, X2 = income)
4. E(Y) = β0 + β1X1 (where Y = savingsnew, X1 = agenew)
5. E(Y) = β0 + β2X2 (where Y = savingsnew, X2 = incomenew)
6. E(Y) = β0 + β1X1 + β2X2 (where Y = savingsnew, X1 = agenew, X2 = incomenew)

Use the model results to complete the following table. The first row has been completed for you; XXX marks cells that do not apply to that model, and blank cells are to be filled in:

Model                             b1       se(b1)   b2       se(b2)
savings vs age                    1.5247   0.0694   XXX      XXX
savings vs income                 XXX      XXX
savings vs age, income
savingsnew vs agenew                                XXX      XXX
savingsnew vs incomenew           XXX      XXX
savingsnew vs agenew, incomenew

The first three models use data in which the two predictors are sufficiently highly correlated to create data-based multicollinearity (as explored in the previous problem). Briefly describe how the results in the first three rows of the table illustrate that:

(i) the estimated regression coefficient of any one variable depends on which other predictor variables are included in the model;
(ii) the precision of the estimated regression coefficients decreases as more predictor variables are added to the model.

The last three models use additional data such that the correlation between the two predictors has been reduced, in the hope that the data-based multicollinearity has been mitigated. Do the results in the last three rows of the table support the following assertions?

(i) The estimated regression coefficient of any one variable no longer depends on which other predictor variables are included in the model;
(ii) the precision of the estimated regression coefficients remains approximately the same as more predictor variables are added to the model.

[Note: You should find that while the results support assertion (ii) to some extent, they don't particularly support assertion (i) at all. The take-home message here is that in most observational studies there will be varying degrees of correlation between predictors that we can't do anything about. This means it is almost always the case that the estimated regression coefficient of any one variable depends on which other predictor variables are included in the model. The only time this is not the case is in datasets where the correlation between predictors is close to zero (which generally occurs only in designed experiments). The rest of the time we need to be careful to include all relevant predictor variables and to interpret regression coefficients correctly, that is, with reference to all the other predictor variables in the model.]
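For reference, here is a minimal sketch of how the six fits and the table entries could be produced in Python with statsmodels. The file name AgeSaving.csv and the column names (savings, age, income, savingsnew, agenew, incomenew) are assumptions, so adjust them to match your copy of the dataset:

import pandas as pd
import statsmodels.formula.api as smf

# Assumed file and column names -- change to match your dataset.
data = pd.read_csv("AgeSaving.csv")

# One formula per row of the table, in the same order.
models = {
    "savings vs age":                  "savings ~ age",
    "savings vs income":               "savings ~ income",
    "savings vs age, income":          "savings ~ age + income",
    "savingsnew vs agenew":            "savingsnew ~ agenew",
    "savingsnew vs incomenew":         "savingsnew ~ incomenew",
    "savingsnew vs agenew, incomenew": "savingsnew ~ agenew + incomenew",
}

for label, formula in models.items():
    fit = smf.ols(formula, data=data).fit()
    # fit.params and fit.bse are Series indexed by term name;
    # skip the intercept and report each slope with its standard error.
    terms = [t for t in fit.params.index if t != "Intercept"]
    row = ", ".join(f"{t}: b = {fit.params[t]:.4f}, se = {fit.bse[t]:.4f}"
                    for t in terms)
    print(f"{label:<33} {row}")

# Check the predictor correlations in each dataset.
print(data[["age", "income"]].corr())
print(data[["agenew", "incomenew"]].corr())

The correlation printout at the end lets you verify the premise behind the last three rows: agenew and incomenew should be noticeably less correlated than age and income.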