Some badly conditioned problems can be improved dramatically by simply scaling the variables so that the parameters of the problem do not differ by orders of magnitude. For a Gauss-Markov model, $E(y) = Xb$, $\mathrm{Cov}(y) = \sigma^2 I$, this may mean using a scaled response $y_* = cy$, where $c$ is some scalar, perhaps $c = 1000$ from changing the units of the response from kilograms to grams. We could also scale the covariates and use the design matrix $X_* = XD$, where $D$ is a diagonal matrix. We can then rewrite the Gauss-Markov model in terms of the rescaled response and design matrix: $E(y_*) = X_* b_*$, $\mathrm{Cov}(y_*) = \sigma_*^2 I$.

a) Write the new parameters in terms of the old; that is, find $S(b)$ and $T(\sigma^2)$ so that $b_* = S(b)$ and $\sigma_*^2 = T(\sigma^2)$.

b) Do the usual least squares estimators give the correct adjustment; that is, are $\hat{b}_* = S(\hat{b})$ and $\hat{\sigma}_*^2 = T(\hat{\sigma}^2)$?

c) For the following problems, examine whether the estimators/algorithm respond properly to changes in scale. Consider whether both $c$ and $D$ are appropriate, and also (if appropriate) estimating $\sigma^2$.

i) GLS arising from $\mathrm{Cov}(y) = \sigma^2 V$ with $V$ known.

ii) Ridge regression, with $\tilde{b} = (X^T X + kI)^{-1} X^T y$.

iii) Newton/scoring for logistic regression.
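Before working parts (a)-(c) algebraically, the scaling behaviour can be probed numerically. The sketch below is only an illustration and assumes Python with NumPy; the simulated data and the particular values chosen for $c$ and for the diagonal of $D$ are arbitrary. It fits ordinary least squares to the original data and to the rescaled data $y_* = cy$, $X_* = XD$, so the resulting coefficient and variance ratios can be compared against whatever $S(b)$ and $T(\sigma^2)$ you derive.

```python
import numpy as np

# Simulated Gauss-Markov data (illustrative values only)
rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))
b_true = np.array([2.0, -1.0, 0.5])
y = X @ b_true + rng.normal(scale=0.3, size=n)

c = 1000.0                      # response rescaling, e.g. kilograms -> grams
d = np.array([10.0, 0.1, 2.0])  # hypothetical diagonal of D for the covariates
D = np.diag(d)

y_star = c * y                  # rescaled response
X_star = X @ D                  # rescaled design matrix

# Ordinary least squares on the original and on the rescaled problem
b_hat, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
b_hat_star, rss_star, *_ = np.linalg.lstsq(X_star, y_star, rcond=None)

# Residual variance estimates, RSS / (n - p), for both fits
s2_hat = rss[0] / (n - p)
s2_hat_star = rss_star[0] / (n - p)

print("b_hat      :", b_hat)
print("b_hat_star :", b_hat_star)
print("coef ratio :", b_hat_star / b_hat)     # compare with S(b) from part (a)
print("var ratio  :", s2_hat_star / s2_hat)   # compare with T(sigma^2)
```

The same harness can be reused when examining part (c): replace the two least-squares fits with a GLS, ridge, or Newton/scoring logistic fit and check whether the estimates still adjust cleanly under changes in $c$ and $D$.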


