Multivariate linear regression fits the model
\[
Y = XB + E,
\]
where $Y$ is a matrix of response variables; $X$ is a model matrix (just as in the univariate linear model); $B$ is a matrix of regression coefficients, one column per response variable; and $E$ is a matrix of errors. The least-squares estimator of $B$ is $\hat{B} = (X'X)^{-1}X'Y$, equivalent to what one would get from separate least-squares regressions of each $Y$ on the $X$s. See Section 9.5 for a discussion of the multivariate linear model.
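As a quick numerical sketch (not part of the original exercise), the NumPy snippet below checks that solving the normal equations once for all responses reproduces the coefficients from separate column-by-column regressions; all variable names and dimensions are illustrative assumptions.

```python
import numpy as np

# Minimal check: the multivariate estimator B_hat = (X'X)^{-1} X'Y matches the
# coefficients from separate regressions of each response column on X.
rng = np.random.default_rng(0)

n, k, m = 100, 3, 2                                           # observations, predictors, responses
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])    # model matrix with an intercept column
B_true = rng.normal(size=(k + 1, m))                          # one coefficient column per response
Y = X @ B_true + rng.normal(scale=0.5, size=(n, m))           # add the error matrix E

# Solve the normal equations once for all responses.
B_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# Separate univariate least-squares fits, one response column at a time.
B_sep = np.column_stack([np.linalg.lstsq(X, Y[:, j], rcond=None)[0] for j in range(m)])

print(np.allclose(B_hat, B_sep))                              # expected: True
```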
(a) Show how $\hat{B}$ can be computed from the means of the variables, $\mu_Y$ and $\mu_X$, and from their covariances, $\hat{S}_{XX}$ and $\hat{S}_{XY}$ (among the $X$s and between the $X$s and $Y$s, respectively).
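One way to organize part (a), offered only as a hedged sketch: partition $\hat{B}$ into an intercept row $\hat{\beta}_0'$ and a slope block $\hat{B}_1$ (labels introduced here for illustration), and aim for relations of the form
\[
\hat{B}_1 = \hat{S}_{XX}^{-1}\,\hat{S}_{XY},
\qquad
\hat{\beta}_0' = \mu_Y' - \mu_X'\,\hat{B}_1 ,
\]
which follow from centering the $X$s and $Y$s before applying least squares.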
(b) The fitted values from the multivariate regression are $\hat{Y} = X\hat{B}$. It follows that the fitted values $\hat{Y}_{ij}$ and $\hat{Y}_{ij'}$ for the $i$th observation on response variables $j$ and $j'$ are both linear combinations of the $i$th row of the model matrix, $\mathbf{x}_i'$. Use this fact to find an expression for the covariance of $\hat{Y}_{ij}$ and $\hat{Y}_{ij'}$.
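A hedged sketch of where part (b) is headed, writing $\hat{\beta}_j$ for the $j$th column of $\hat{B}$ and $\sigma_{jj'}$ for the error covariance between responses $j$ and $j'$ (symbols introduced here, not in the original): since $\hat{Y}_{ij} = \mathbf{x}_i'\hat{\beta}_j$,
\[
\operatorname{Cov}(\hat{Y}_{ij}, \hat{Y}_{ij'})
  = \mathbf{x}_i'\,\operatorname{Cov}(\hat{\beta}_j, \hat{\beta}_{j'})\,\mathbf{x}_i
  = \sigma_{jj'}\,\mathbf{x}_i'(X'X)^{-1}\mathbf{x}_i .
\]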
(c) Show how this result can be used in Equation 20.7 (on page 618), which applies the EM algorithm to multivariate-normal data with missing values.