To examine the dynamics involved in your group work, each member is required to submit, under separate cover, a report describing in detail the contribution of the other members to the overall workings of the group. The report should address, among other things, how the team worked together, whether any members shirked their responsibilities and became free-riders, and whether any major problems or learning issues were encountered. If applicable, discuss how these issues were resolved. The report should also answer questions such as: How were duties assigned? Were duties evenly distributed? What specific duties did you perform? Did you participate fully in the report preparation? If not, why not? Did all members participate fully? If not, who did not?
Finally, the report should state the grade you would give each group member (including yourself) for overall performance in the group.
Descriptive Statistics

                     SAL         YRS         GP          PTS         PM          POS
 Mean                2215.143    5.868228    366.9312    174.7511    4.513909    0.647145
 Median              1400        5           302         100        -1           1
 Maximum              9538       22          1628        1524        432         1
 Minimum              500        0           1           0          -118        0
 Std. Dev.           1825.628    4.506128    301.8727    214.3255    41.39146    0.478208
 Skewness            1.220004    0.852975    1.035288    2.24101     2.551545   -0.615852
 Kurtosis            3.905899    3.103496    3.699936    9.516688    22.24366    1.379274
 Jarque-Bera         192.7852    83.12613    135.951     1780.232    11279.74    117.927
 Probability         0.000000    0.000000    0.000000    0.000000    0.000000    0.000000
 Sum                 1512943     4008        250614      119355      3083        442
 Sum Sq. Dev.        2.27E+09    13848.14    62148708    31327960    1168439     155.9619
 Observations        683         683         683         683         683         683

Correlation Matrix of Variables

         YRS         GP          PTS         PM          POS
 YRS     1           0.957189    0.766274    0.174195    0.00697
 GP      0.957189    1           0.846853    0.235061    0.017942
 PTS     0.766274    0.846853    1           0.353878    0.191403
 PM      0.174195    0.235061    0.353878    1          -0.031716
 POS     0.00697     0.017942    0.191403   -0.031716    1

Regression 1

Dependent Variable: SAL
Method: Least Squares
Date: 03/09/11   Time: 10:40
Sample: 1 683
Included observations: 683

 Variable      Coefficient    Std. Error    t-Statistic    Prob.
 Constant      1825.332       113.6549      16.0603        0.0000
 YRS          -267.6806       40.30237     -6.641806       0.0000
 GP            3.589512       0.73417       4.889213       0.0000
 PTS           5.561903       0.508968      10.92779       0.0000
 PM            3.624179       1.323857      2.737591       0.0064
 POS          -532.7853       112.2929     -4.744602       0.0000

 R-squared             0.492056    Mean dependent var       2215.143
 Adjusted R-squared    0.488305    S.D. dependent var       1825.628
 S.E. of regression    1305.924    Akaike info criterion    17.19596
 Sum squared resid     1.15E+09    Schwarz criterion        17.23572
 Log likelihood       -5866.419    Hannan-Quinn criter.     17.21134
 F-statistic           131.1649    Durbin-Watson stat       0.91174
 Prob(F-statistic)     0.000000

Heteroskedasticity Test: Breusch-Pagan-Godfrey

 F-statistic           69.79623    Prob. F(5,677)           0.0000
 Obs*R-squared         232.3182    Prob. Chi-Square(5)      0.0000
 Scaled explained SS   735.7594    Prob. Chi-Square(5)      0.0000

Test Equation:
Dependent Variable: RESID^2
Method: Least Squares
Date: 03/09/11   Time: 11:13
Sample: 1 683
Included observations: 683

 Variable      Coefficient    Std. Error    t-Statistic    Prob.
 C             1320936        304781.1      4.334048       0.0000
 YRS          -75235.67       108076.3     -0.696135       0.4866
 GP           -4608.797       1968.777     -2.340945       0.0195
 PTS           18611.16       1364.868      13.63586       0.0000
 PM           -8676.12        3550.104     -2.443906       0.0148
 POS          -1098712        301128.8     -3.648644       0.0003

 R-squared             0.340144    Mean dependent var       1690456
 Adjusted R-squared    0.33527     S.D. dependent var       4295318
 S.E. of regression    3502014     Akaike info criterion    32.98432
 Sum squared resid     8.30E+15    Schwarz criterion        33.02408
 Log likelihood       -11258.15    Hannan-Quinn criter.     32.99971
 F-statistic           69.79623    Durbin-Watson stat       1.920121
 Prob(F-statistic)     0.000000

Heteroskedasticity Test: Harvey

 F-statistic           24.82372    Prob. F(5,677)           0.0000
 Obs*R-squared         105.8183    Prob. Chi-Square(5)      0.0000
 Scaled explained SS   111.2594    Prob. Chi-Square(5)      0.0000

Test Equation:
Dependent Variable: LRESID2
Method: Least Squares
Date: 03/09/11   Time: 11:14
Sample: 1 683
Included observations: 683

 Variable      Coefficient    Std. Error    t-Statistic    Prob.
 C             13.39637       0.183044      73.18669       0.0000
 YRS          -0.259768       0.064908     -4.002099       0.0001
 GP            0.002249       0.001182      1.901841       0.0576
 PTS           0.005242       0.00082       6.395119       0.0000
 PM           -0.004289       0.002132     -2.011744       0.0446
 POS          -1.134717       0.18085      -6.274344       0.0000

 R-squared             0.154932    Mean dependent var       12.8595
 Adjusted R-squared    0.14869     S.D. dependent var       2.279507
 S.E. of regression    2.10322     Akaike info criterion    4.333562
 Sum squared resid     2994.734    Schwarz criterion        4.373327
 Log likelihood       -1473.912    Hannan-Quinn criter.     4.348951
 F-statistic           24.82372    Durbin-Watson stat       1.728548
 Prob(F-statistic)     0.000000

Heteroskedasticity Test: Glejser

 F-statistic           72.43531    Prob. F(5,677)           0.0000
 Obs*R-squared         238.041     Prob. Chi-Square(5)      0.0000
 Scaled explained SS   296.9904    Prob. Chi-Square(5)      0.0000

Test Equation:
Dependent Variable: ARESID
Method: Least Squares
Date: 03/09/11   Time: 11:14
Sample: 1 683
Included observations: 683

 Variable      Coefficient    Std. Error    t-Statistic    Prob.
 C             960.3758       62.04116      15.47966       0.0000
 YRS          -63.127         21.99998     -2.869411       0.0042
 GP           -0.048159       0.400764     -0.120169       0.9044
 PTS           3.5512         0.277832      12.78182       0.0000
 PM           -1.837071       0.722658     -2.542102       0.0112
 POS          -350.4721       61.29769     -5.717541       0.0000

 R-squared             0.348523    Mean dependent var       957.7385
 Adjusted R-squared    0.343711    S.D. dependent var       879.9584
 S.E. of regression    712.8689    Akaike info criterion    15.98522
 Sum squared resid     3.44E+08    Schwarz criterion        16.02498
 Log likelihood       -5452.952    Hannan-Quinn criter.     16.00061
 F-statistic           72.43531    Durbin-Watson stat       1.811922
 Prob(F-statistic)     0.000000

Heteroskedasticity Test: ARCH

 F-statistic           29.41993    Prob. F(1,680)           0.0000
 Obs*R-squared         28.28282    Prob. Chi-Square(1)      0.0000

Test Equation:
Dependent Variable: RESID^2
Method: Least Squares
Date: 03/09/11   Time: 11:15
Sample (adjusted): 2 683
Included observations: 682 after adjustments

 Variable       Coefficient    Std. Error    t-Statistic    Prob.
 C              1324142        171121.1      7.738043       0.0000
 RESID^2(-1)    0.201059       0.037068      5.424014       0.0000

 R-squared             0.04147     Mean dependent var       1664318
 Adjusted R-squared    0.040061    S.D. dependent var       4243764
 S.E. of regression    4157891     Akaike info criterion    33.32184
 Sum squared resid     1.18E+16    Schwarz criterion        33.33511
 Log likelihood       -11360.75    Hannan-Quinn criter.     33.32698
 F-statistic           29.41993    Durbin-Watson stat       2.082468
 Prob(F-statistic)     0.000000

Heteroskedasticity Test: White

 F-statistic           34.60145    Prob. F(19,663)          0.0000
 Obs*R-squared         340.0588    Prob. Chi-Square(19)     0.0000
 Scaled explained SS   1076.977    Prob. Chi-Square(19)     0.0000

Test Equation:
Dependent Variable: RESID^2
Method: Least Squares
Date: 03/09/11   Time: 11:15
Sample: 1 683
Included observations: 683
Collinear test regressors dropped from specification

 Variable      Coefficient    Std. Error    t-Statistic    Prob.
 C             1282039        396134.8      3.23637        0.0013
 YRS          -165774.7       247001.1     -0.67115        0.5024
 YRS^2         11690.03       41249.3       0.283399       0.777
 YRS*GP        765.3104       1468.899      0.521009       0.6025
 YRS*PTS      -2432.39        1009.58      -2.409309       0.0163
 YRS*PM       -1635.536       2781.015     -0.588108       0.5567
 YRS*POS      -165097.4       221369.6     -0.7458         0.4561
 GP           -2914.417       4795.208     -0.607777       0.5435
 GP^2         -3.460645       14.81785     -0.233546       0.8154
 GP*PTS       -5.374443       20.03253     -0.268286       0.7886
 GP*PM         45.39543       48.8066       0.930109       0.3527
 GP*POS        4252.06        4478.222      0.949498       0.3427
 PTS           11865.43       5048.48       2.350297       0.0191
 PTS^2         48.09845       7.609606      6.320754       0.0000
 PTS*PM       -45.13812       26.36385     -1.712121       0.0873
 PTS*POS      -7439.593       4631.601     -1.606268       0.1087
 PM            2809.575       11152.07      0.251933       0.8012
 PM^2         -54.40268       36.32967     -1.497472       0.1347
 PM*POS       -2746.517       9397.118     -0.292272       0.7702
 POS          -185852.1       433320.5     -0.428902       0.6681

 R-squared             0.49789     Mean dependent var       1690456
 Adjusted R-squared    0.483501    S.D. dependent var       4295318
 S.E. of regression    3086955     Akaike info criterion    32.75211
 Sum squared resid     6.32E+15    Schwarz criterion        32.88466
 Log likelihood       -11164.85    Hannan-Quinn criter.     32.80341
 F-statistic           34.60145    Durbin-Watson stat       1.924307
 Prob(F-statistic)     0.000000

Autocorrelation

Breusch-Godfrey Serial Correlation LM Test:

 F-statistic           387.2024    Prob. F(1,676)           0.0000
 Obs*R-squared         248.7384    Prob. Chi-Square(1)      0.0000

Test Equation:
Dependent Variable: RESID
Method: Least Squares
Date: 04/26/11   Time: 13:50
Sample: 1 683
Included observations: 683
Presample missing value lagged residuals set to zero.

 Variable      Coefficient    Std. Error    t-Statistic    Prob.
 C             103.0766       90.84432      1.134652       0.2569
 YRS           86.44363       32.45873      2.663186       0.0079
 GP           -0.936696       0.587776     -1.593627       0.1115
 PTS          -2.073827       0.419593     -4.942475       0.0000
 PM           -0.558088       1.056779     -0.528103       0.5976
 POS           150.5828       89.93253      1.674398       0.0945
 RESID(-1)     0.680239       0.034569      19.67746       0.0000

 R-squared             0.364185    Mean dependent var      -1.84E-13
 Adjusted R-squared    0.358542    S.D. dependent var       1301.128
 S.E. of regression    1042.088    Akaike info criterion    16.74604
 Sum squared resid     7.34E+08    Schwarz criterion        16.79243
 Log likelihood       -5711.771    Hannan-Quinn criter.     16.76399
 F-statistic           64.53374    Durbin-Watson stat       2.017892
 Prob(F-statistic)     0.000000

Multicollinearity

Dependent Variable: YRS
Method: Least Squares
Date: 04/26/11   Time: 14:08
Sample: 1 683
Included observations: 683

 Variable      Coefficient    Std. Error    t-Statistic    Prob.
 C             0.365525       0.10739       3.40372        0.0007
 GP            0.016289       0.000313      52.00073       0.0000
 PTS          -0.003168       0.00047      -6.746618       0.0000
 PM           -0.003103       0.001256     -2.470727       0.0137
 POS           0.144398       0.106862      1.351261       0.1771

 R-squared             0.92418     Mean dependent var       5.868228
 Adjusted R-squared    0.923733    S.D. dependent var       4.506128
 S.E. of regression    1.244436    Akaike info criterion    3.282535
 Sum squared resid     1049.965    Schwarz criterion        3.315672
 Log likelihood       -1115.986    Hannan-Quinn criter.     3.295359
 F-statistic           2066.061    Durbin-Watson stat       1.803006
 Prob(F-statistic)     0.000000

Dependent Variable: GP
Method: Least Squares
Date: 04/26/11   Time: 14:11
Sample: 1 683
Included observations: 683

 Variable      Coefficient    Std. Error    t-Statistic    Prob.
 C             24.36055       5.87127       4.149111       0.0000
 YRS           49.08535       0.943936      52.00073       0.0000
 PTS           0.412382       0.021402      19.26864       0.0000
 PM            0.017855       0.069248      0.257844       0.7966
 POS          -27.22451       5.780297     -4.709882       0.0000

 R-squared             0.949089    Mean dependent var       366.9312
 Adjusted R-squared    0.948789    S.D. dependent var       301.8727
 S.E. of regression    68.31351    Akaike info criterion    11.29339
 Sum squared resid     3164047     Schwarz criterion        11.32652
 Log likelihood       -3851.691    Hannan-Quinn criter.     11.30621
 F-statistic           3159.846    Durbin-Watson stat       1.80872
 Prob(F-statistic)     0.000000

Dependent Variable: PTS
Method: Least Squares
Date: 04/26/11   Time: 14:12
Sample: 1 683
Included observations: 683

 Variable      Coefficient    Std. Error    t-Statistic    Prob.
 C            -78.43958       8.029453     -9.768981       0.0000
 YRS          -19.86102       2.943848     -6.746618       0.0000
 GP            0.858047       0.044531      19.26864       0.0000
 PM            0.767179       0.095449      8.037571       0.0000
 POS           79.47613       7.904338      10.05475       0.0000

 R-squared             0.789854    Mean dependent var       174.7511
 Adjusted R-squared    0.788614    S.D. dependent var       214.3255
 S.E. of regression    98.53993    Akaike info criterion    12.02609
 Sum squared resid     6583459     Schwarz criterion        12.05923
 Log likelihood       -4101.911    Hannan-Quinn criter.     12.03892
 F-statistic           637.0804    Durbin-Watson stat       1.721919
 Prob(F-statistic)     0.000000

Dependent Variable: PM
Method: Least Squares
Date: 04/26/11   Time: 14:13
Sample: 1 683
Included observations: 683

 Variable      Coefficient    Std. Error    t-Statistic    Prob.
 C             7.548277       3.284332      2.298269       0.0218
 YRS          -2.875762       1.163934     -2.470727       0.0137
 GP            0.005491       0.021297      0.257844       0.7966
 PTS           0.113396       0.014108      8.037571       0.0000
 POS          -12.34605       3.222899     -3.830727       0.0001

 R-squared             0.167187    Mean dependent var       4.513909
 Adjusted R-squared    0.162273    S.D. dependent var       41.39146
 S.E. of regression    37.88454    Akaike info criterion    10.11426
 Sum squared resid     973091.4    Schwarz criterion        10.14739
 Log likelihood       -3449.019    Hannan-Quinn criter.     10.12708
 F-statistic           34.02698    Durbin-Watson stat       1.662007
 Prob(F-statistic)     0.000000
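For readers who want to reproduce output of this kind outside EViews, the following is a minimal Python sketch using pandas and statsmodels. It assumes a hypothetical file salaries.csv containing columns named SAL, YRS, GP, PTS, PM and POS; the file name and the code are illustrative only and are not the workflow that generated the tables above.

import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.diagnostic import het_breuschpagan, het_white, acorr_breusch_godfrey

# Load the (hypothetical) data set.
df = pd.read_csv("salaries.csv")

# Regression 1: SAL on YRS, GP, PTS, PM and POS.
model = smf.ols("SAL ~ YRS + GP + PTS + PM + POS", data=df).fit()
print(model.summary())

# Breusch-Pagan test: auxiliary regression of squared residuals on the explanators.
bp_lm, bp_lm_p, bp_f, bp_f_p = het_breuschpagan(model.resid, model.model.exog)
print(f"Breusch-Pagan   LM = {bp_lm:.2f} (p = {bp_lm_p:.4f})   F = {bp_f:.2f} (p = {bp_f_p:.4f})")

# White test: also includes squares and cross-products of the explanators.
w_lm, w_lm_p, w_f, w_f_p = het_white(model.resid, model.model.exog)
print(f"White           LM = {w_lm:.2f} (p = {w_lm_p:.4f})")

# Breusch-Godfrey serial correlation LM test with one lagged residual.
bg_lm, bg_lm_p, bg_f, bg_f_p = acorr_breusch_godfrey(model, nlags=1)
print(f"Breusch-Godfrey(1)   LM = {bg_lm:.2f} (p = {bg_lm_p:.4f})")

The LM statistics printed here play the role of the Obs*R-squared figures reported in the EViews tables above.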
Lecture 12: Heteroskedasticity

* Review of Standard Errors

* Review of Standard Errors (cont.)
Problem: we do not know σ².
Solution: estimate σ².
We do not observe the ACTUAL error terms, εᵢ.
We DO observe the residuals, eᵢ, and can use them to form s² = Σ eᵢ² / (n − k), where k is the number of estimated coefficients.

* Review of Standard Errors (cont.)
Our formula for estimated standard errors relied on ALL of the Gauss–Markov DGP assumptions. For this lecture, we will focus on the assumption of homoskedasticity. What happens if we relax the assumption that the error variance is constant?

* Heteroskedasticity (Chapter 10.1)
HETEROSKEDASTICITY: the variance of εᵢ is NOT a constant σ². The variance of εᵢ is greater for some observations than for others.

* Heteroskedasticity (cont.)
For example, consider a regression of housing expenditures on income. Consumers with low incomes have little scope for varying their rent expenditures: Var(εᵢ) is low. Wealthy consumers can choose to spend a lot of money on rent, or to spend less, depending on tastes: Var(εᵢ) is high.

* Figure 10.1: Rents and Incomes for a Sample of New Yorkers

* OLS and Heteroskedasticity
What are the implications of heteroskedasticity for OLS? Under the Gauss–Markov assumptions (including homoskedasticity), OLS was the Best Linear Unbiased Estimator. Under heteroskedasticity, is OLS still unbiased? Is OLS still best?

* OLS and Heteroskedasticity (cont.)
A DGP with heteroskedasticity:
Yᵢ = β₀ + β₁X₁ᵢ + … + βₖXₖᵢ + εᵢ
E(εᵢ) = 0, Var(εᵢ) = σᵢ² (no longer a common σ²)
Cov(εᵢ, εⱼ) = 0 for i ≠ j, with the explanators fixed across samples.

* OLS and Heteroskedasticity (cont.)
The unbiasedness conditions are the same as under the Gauss–Markov DGP. OLS is still unbiased!

* OLS and Heteroskedasticity (cont.)
To determine whether OLS is "best" (i.e., the unbiased linear estimator with the lowest variance), we need to calculate the variance of a linear estimator under heteroskedasticity.

* OLS and Heteroskedasticity
The variance of a linear estimator Σ wᵢYᵢ is Σ wᵢ²σᵢ². OLS chooses weights that minimize Σ wᵢ², which is optimal only when all the σᵢ² are equal. OLS is no longer efficient!

* OLS and Heteroskedasticity (cont.)
Under heteroskedasticity, OLS is unbiased but inefficient. OLS does not have the smallest possible variance, but its variance may be acceptable, and the estimates are still unbiased. However, we do have one very serious problem: our estimated standard error formulas are wrong!

* OLS and Heteroskedasticity (cont.)
Implications of heteroskedasticity:
OLS is still unbiased.
OLS is no longer efficient; some other linear estimator will have a lower variance.
Estimated standard errors will be incorrect; confidence intervals and hypothesis tests (both t- and F-tests) will be incorrect.

* OLS and Heteroskedasticity (cont.)
Implications of heteroskedasticity:
OLS is no longer efficient; some other linear estimator will have a lower variance. Can we use a better estimator?
Estimated standard errors will be incorrect; confidence intervals and hypothesis tests (both t- and F-tests) will be incorrect. If we keep using OLS, can we calculate correct estimated standard errors?

* Tests for Heteroskedasticity
Before we turn to remedies for heteroskedasticity, let us first consider tests for the complication. There are two types of tests:
Tests for continuous changes in variance: the White and Breusch–Pagan tests.
Tests for discrete (lumpy) changes in variance: the Goldfeld–Quandt test.

* The White Test
The White test for heteroskedasticity has a basic premise: if disturbances are homoskedastic, then squared errors are on average roughly constant. Explanators should NOT be able to predict squared errors, or their proxy, squared residuals. The White test is the most general test for heteroskedasticity.

* The White Test (cont.)
Five steps of the White test:
1. Regress Y against your various explanators using OLS.
2. Compute the OLS residuals, e₁, …, eₙ.
3. Regress eᵢ² against a constant, all of the explanators, the squares of the explanators, and all possible interactions between the explanators (p slopes in total).

* The White Test (cont.)
Five steps of the White test (cont.):
4. Compute R² from the "auxiliary equation" in step 3.
5. Compare nR² to the critical value from the chi-squared distribution with p degrees of freedom.
(A code sketch of these five steps appears below.)

* The White Test: Example

* The White Test
The White test is very general, and provides very explicit directions; the econometrician has no judgment calls to make. The White test also burns through degrees of freedom very, very rapidly. The White test is therefore appropriate only for "large" sample sizes.
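As a concrete illustration of the five steps just listed, here is a minimal sketch of an nR² White test in Python with numpy, scipy and statsmodels. The function name white_test and its arguments y (response vector) and X (an n × k array of explanators without a constant column) are illustrative assumptions, not part of the lecture.

import numpy as np
import statsmodels.api as sm
from scipy import stats

def white_test(y, X):
    """Return (n*R^2, p-value) for the White heteroskedasticity test."""
    n, k = X.shape
    # Steps 1-2: estimate the original model by OLS and keep the squared residuals.
    e2 = sm.OLS(y, sm.add_constant(X)).fit().resid ** 2
    # Step 3: regress e_i^2 on a constant, the explanators, their squares, and all
    # pairwise cross-products (p slopes in total).  In practice, collinear columns
    # (e.g. the square of a 0/1 dummy such as POS) should be dropped first, as the
    # EViews White test above did automatically.
    crosses = [(X[:, i] * X[:, j])[:, None] for i in range(k) for j in range(i + 1, k)]
    Z = sm.add_constant(np.hstack([X, X ** 2] + crosses))
    aux = sm.OLS(e2, Z).fit()
    # Steps 4-5: n*R^2 is compared with a chi-squared critical value with p degrees of freedom.
    p = Z.shape[1] - 1
    nR2 = n * aux.rsquared
    return nR2, stats.chi2.sf(nR2, p)

For the salary regression above, EViews reports Obs*R-squared = 340.06 against a chi-squared distribution with 19 degrees of freedom, so homoskedasticity is rejected at any conventional significance level.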
* The Breusch–Pagan Test
The Breusch–Pagan test is very similar to the White test. The White test specifies exactly which explanators to include in the auxiliary equation. Because the White test includes cross-terms, the number of auxiliary regressors, and hence the degrees of freedom it uses, grows rapidly with the number of explanators.
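In that spirit, the Breusch–Pagan–Godfrey output above used only the five original regressors in its auxiliary equation. A minimal sketch of the studentized (nR²) variant follows, assuming the squared OLS residuals are regressed on a user-supplied matrix Z that already includes a constant; the function name breusch_pagan is illustrative.

import statsmodels.api as sm
from scipy import stats

def breusch_pagan(resid, Z):
    """n*R^2 statistic from regressing squared residuals on Z (Z includes a constant)."""
    aux = sm.OLS(resid ** 2, Z).fit()
    lm = len(resid) * aux.rsquared
    df = Z.shape[1] - 1          # number of slopes in the auxiliary regression
    return lm, stats.chi2.sf(lm, df)

With the five original explanators as Z, the statistic has only 5 degrees of freedom (Obs*R-squared = 232.32 in the output above), compared with 19 for the White version, which is far easier on degrees of freedom.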