We used the `mtcars` data in the lecture notes, and also introduced $k$-fold cross-validation. For this question you need to complete the following:

* Write $5$-fold cross-validation code by yourself, using the `lm.ridge()` function to fit the model and predict on the testing data. Choose an appropriate range of lambda values based on how this function specifies the penalty. Obtain the cross-validation error corresponding to each $\lambda$ and produce an intuitive plot of how it changes over different $\lambda$ values. What is the best penalty level you obtained from this procedure? Compare that with the GCV result. Please note that you should clearly state the intention of each step of your code and state your result. For details regarding writing a report, please watch the `Comment Video on HW` from the week 1 webpage, or see the discussion board. (A minimal sketch of this procedure follows this list.)
* Use the `cv.glmnet()` function from the `glmnet` package to perform a $5$-fold cross-validation using its built-in feature. Produce the cross-validation error plot against the $\lambda$ values. Report the $\lambda$ values selected by `lambda.min` and `lambda.1se`. (A second sketch follows as well.)
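A minimal sketch of the first part, assuming the usual `mtcars` setup (`mpg` as the outcome, all other variables as predictors); the fold assignment, the lambda grid, and the names `infold`, `cv_err`, and `lambda_grid` are illustrative choices, not prescribed by the assignment:

```r
library(MASS)   # provides lm.ridge()

data(mtcars)

set.seed(1)
nfold = 5
# randomly assign each of the 32 observations to one of the 5 folds
infold = sample(rep(1:nfold, length.out = nrow(mtcars)))

# lm.ridge()'s `lambda` argument is on the n*lambda scale of the notes,
# so this grid is an illustrative range on that scale
lambda_grid = seq(0, 40, by = 0.5)
cv_err = matrix(NA, nfold, length(lambda_grid))

for (k in 1:nfold) {
  traindata = mtcars[infold != k, ]
  testdata  = mtcars[infold == k, ]
  for (j in seq_along(lambda_grid)) {
    fit = lm.ridge(mpg ~ ., data = traindata, lambda = lambda_grid[j])
    # lm.ridge() has no predict() method; build predictions from the
    # original-scale coefficients returned by coef()
    pred = data.matrix(cbind(1, testdata[, -1])) %*% coef(fit)
    cv_err[k, j] = mean((testdata$mpg - pred)^2)
  }
}

# average over folds and plot the CV error against lambda
mean_cv = colMeans(cv_err)
plot(lambda_grid, mean_cv, type = "l",
     xlab = "lambda (lm.ridge scale)", ylab = "5-fold CV error")

# the best penalty from this procedure
lambda_grid[which.min(mean_cv)]

# the GCV choice over the same grid, for comparison
fitall = lm.ridge(mpg ~ ., data = mtcars, lambda = lambda_grid)
lambda_grid[which.min(fitall$GCV)]
```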
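And a sketch of the second part; note that `cv.glmnet()` parameterizes the penalty on its own scale, so the selected values are not directly comparable to the `lm.ridge()` grid above:

```r
library(glmnet)

x = data.matrix(mtcars[, -1])
y = mtcars$mpg

set.seed(1)
# alpha = 0 requests the ridge penalty; nfolds = 5 gives 5-fold CV
cvfit = cv.glmnet(x, y, alpha = 0, nfolds = 5)

# built-in CV error plot against log(lambda)
plot(cvfit)

# the two selected penalty levels
cvfit$lambda.min
cvfit$lambda.1se
```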



Ridge Regression and the Bias-variance Trade-off

Ruoqing Zhu

Last Updated: September 20, 2021

Ridge Regression

Ridge regression was proposed by Hoerl and Kennard (1970), but is also a special case of the Tikhonov regularization (https://en.wikipedia.org/wiki/Tikhonov_regularization). The essential idea is very simple: knowing that the ordinary least squares (OLS) solution is not unique in an ill-posed problem, i.e., when $X^TX$ is not invertible, a ridge regression adds a ridge (diagonal matrix) on $X^TX$:

$$\hat{\beta}^{\text{ridge}} = (X^TX + n\lambda I)^{-1} X^T y.$$

It provides a solution of linear regression when multicollinearity happens, especially when the number of variables is larger than the sample size. Alternatively, this is also the solution of a regularized least squares estimator. We add an $\ell_2$ penalty to the residual sum of squares, i.e.,

$$\hat{\beta}^{\text{ridge}} = \arg\min_\beta \, \|y - X\beta\|_2^2 + n\lambda \|\beta\|_2^2 = \arg\min_\beta \, \frac{1}{n}\sum_{i=1}^n \big(y_i - x_i^T\beta\big)^2 + \lambda \sum_{j=1}^p \beta_j^2,$$

for some penalty $\lambda > 0$. Another approach that leads to the ridge regression is a constraint on the $\ell_2$ norm of the parameters, which will be introduced in the next week. Ridge regression is used extensively in genetic analyses to address "small-$n$-large-$p$" problems. We will start with a motivating example and then discuss the crucial topic this week: the bias-variance trade-off.

Motivation: Correlated Variables and Convexity

Ridge regression has many advantages. Most notably, it can address highly correlated variables. From an optimization point of view, having highly correlated variables means that the objective function ($\ell_2$ loss) becomes "flat" along certain directions in the parameter domain. This can be seen from the following example, where the true parameters are both 1, while the estimated parameters attribute almost all of the effect to the first variable. You can change the seed to observe the variability of these parameter estimates and notice that they are quite large. Instead, if we fit a ridge regression, the parameter estimates are relatively stable.

```r
library(MASS)
set.seed(2)
n = 30

# create highly correlated variables and a linear model
X = mvrnorm(n, c(0, 0), matrix(c(1, 0.99, 0.99, 1), 2, 2))
y = rnorm(n, mean = X[,1] + X[,2])

# compare parameter estimates
summary(lm(y ~ X - 1))$coef
##     Estimate Std. Error    t value  Pr(>|t|)
## X1 1.8461255   1.294541 1.42608527 0.1648987
## X2 0.0990278   1.321283 0.07494822 0.9407888

# note that the true parameters are all 1's
# be careful that the `lambda` parameter in lm.ridge is our (n*lambda)
lm.ridge(y ~ X - 1, lambda = 5)
##        X1        X2
## 0.9413221 0.8693253
```

The variances of both $\hat\beta_1$ and $\hat\beta_2$ are quite large. This is expected because we know from linear regression that the variance of $\hat\beta$ is $\sigma^2 (X^TX)^{-1}$. However, since the columns of $X$ are highly correlated, the smallest eigenvalue of $X^TX$ is close to 0, making the largest eigenvalue of $(X^TX)^{-1}$ very large.
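The closed-form solution above can be verified directly. A minimal check, reusing the `X` and `y` just generated; since `lm.ridge()` centers and scales internally, the manual solution matches its output only approximately:

```r
# manual ridge solution; lm.ridge()'s lambda = 5 corresponds to adding 5 * I here
solve(t(X) %*% X + 5 * diag(2)) %*% t(X) %*% y
```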
This can also be interpreted through an optimization point of view. The objective function of the OLS estimator is demonstrated in the following.

```r
beta1 <- seq(0, 3, 0.005)
beta2 <- seq(-1, 2, 0.005)
allbeta <- data.matrix(expand.grid(beta1, beta2))
rss <- matrix(apply(allbeta, 1, function(b, X, y) sum((y - X %*% b)^2), X, y),
              length(beta1), length(beta2))

# quantile levels for drawing contour
quanlvl = c(0.01, 0.025, 0.05, 0.2, 0.5, 0.75)

# plot the contour
contour(beta1, beta2, rss, levels = quantile(rss, quanlvl))
box()

# the truth
points(1, 1, pch = 19, col = "red", cex = 2)

# the data
betahat <- coef(lm(y ~ X - 1))
points(betahat[1], betahat[2], pch = 19, col = "blue", cex = 2)
```

As an alternative, if we add a ridge regression penalty, the contour is forced to be more convex due to the added eigenvalues. Here is a plot of the $\ell_2$ ridge penalty. (Figure: contour of the $\ell_2$ penalty alone; not reproduced here.)

Hence, by adding this to the OLS objective function, the solution is more stable, in the sense that each time we observe a new set of data, this contour looks pretty much the same. This may be interpreted in several different ways, such as: 1) the objective function is more convex and less affected by the random samples; 2) the variance of the estimator is smaller because the eigenvalues of $X^TX + n\lambda I$ are large.

```r
par(mfrow = c(1, 2))

# adding an L2 penalty to the objective function
rss <- matrix(apply(allbeta, 1, function(b, X, y) sum((y - X %*% b)^2) + b %*% b, X, y),
              length(beta1), length(beta2))

# the ridge solution
bh = solve(t(X) %*% X + diag(2)) %*% t(X) %*% y

contour(beta1, beta2, rss, levels = quantile(rss, quanlvl))
points(1, 1, pch = 19, col = "red", cex = 2)
points(bh[1], bh[2], pch = 19, col = "blue", cex = 2)
box()

# adding a larger penalty
rss <- matrix(apply(allbeta, 1, function(b, X, y) sum((y - X %*% b)^2) + 10 * b %*% b, X, y),
              length(beta1), length(beta2))
bh = solve(t(X) %*% X + 10 * diag(2)) %*% t(X) %*% y

# the ridge solution
contour(beta1, beta2, rss, levels = quantile(rss, quanlvl))
points(1, 1, pch = 19, col = "red", cex = 2)
points(bh[1], bh[2], pch = 19, col = "blue", cex = 2)
box()
```
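Point 2) can be checked numerically: adding a multiple of the identity shifts every eigenvalue of $X^TX$ up by exactly that amount. A small check with the `X` from the running example and the penalty value 5 used earlier:

```r
# smallest eigenvalue of X'X is near zero under strong correlation
eigen(t(X) %*% X)$values

# after adding the ridge, both eigenvalues are bounded away from zero
eigen(t(X) %*% X + 5 * diag(2))$values
```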
causes="" some="" bias="" too.="" choosing="" the="" tuning="" parameter="" is="" a="" balance="" of="" the="" bias-variance="" trade-off,="" which="" will="" be="" discussed="" in="" the="" following.="" 9/21/21,="" 6:13="" pmridge="" regression="" and="" the="" bias-variance="" trade-off="" page="" 6="" of="" 18https://teazrq.github.io/stat432/rnotes/ridge/ridgereg.html#bias_and_variance_of_ridge_regression="" bias="" and="" variance="" of="" ridge="" regression="" we="" have="" not="" formally="" discussed="" the="" bias,="" but="" one="" of="" the="" quantity="" to="" best="" illustrate="" this="" is="" the="" variance="" estimation="" (in="" hw1).="" we="" know="" that="" an="" unbiased="" estimation="" of="" is="" ,="" while="" an="" mle="" estimator="" is="" biased.="" there="" is="" of="" course="" derivation="" of="" this="" commonly="" known="" fact,="" but="" let’s="" introduce="" the="" concept="" of="" a="" simulation="" study,="" which="" we="" can="" use="" in="" the="" future.="" consider="" many="" researchers="" and="" each="" of="" them="" samples="" observations="" to="" estimate="" ,="" and="" they="" are="" all="" using="" the="" biased="" formula.="" on="" average,="" what="" would="" happen?="" consider="" the="" following="" code.="" i="" am="" writing="" it="" in="" the="" most="" naive="" way="" without="" considering="" any="" computational="" efficiency.="" σ2="" (="" −1n−1="" ∑="" n="" i="1" xi="" x̄)2="" (="" −1n="" ∑="" n="" i="1" xi="" x̄)2="" n="3" σ2="" hide="" 9/21/21,="" 6:13="" pmridge="" regression="" and="" the="" bias-variance="" trade-off="" page="" 7="" of="" 18https://teazrq.github.io/stat432/rnotes/ridge/ridgereg.html#bias_and_variance_of_ridge_regression="" set.seed(1)="" #="" number="" of="" researchers="" nsim="1000" #="" number="" of="" observations="" n="3" #="" define="" a="" function="" to="" calculate="" the="" biased="" variance="" es="" timation="" biasedsigma2=""><- function(x)="" sum((x="" -="" mean(x))^2)/length="" (x)="" #="" define="" a="" function="" to="" calculate="" the="" unbiased="" variance="" estimation="" unbiasedsigma2=""><- function(x) sum((x - mean(x))^2)/(len gth(x) - 1) # save all estimated variance in a vector allbiased = rep(na, nsim) allunbiased = rep(na, nsim) # generate 3 observations for each researcher and # record their biased estimation for (i in 1:nsim) { datai = rnorm(n) allbiased[i] = biasedsigma2(datai) allunbiased[i] = unbiasedsigma2(datai) } # the averaged of all of them mean(allbiased) ## [1] 0.7113691 mean(allunbiased) ## [1] 1.067054 on average, the researchers using the biased estimation estimate the to be 0.7114. since the true variance is 1, this is obviously biased. with the same data, the unbiased estimation is 1.0670. we do need to consider that the variation across different researchers is quite large, however, we know from the theory that our conclusion should be correct. now, let’s apply the same analysis on the ridge regression estimator. for the theoretical justification of this analysis, please read the smlr textbook (https://teazrq.github.io/smlr/ridge-regression.html#bias-and-variance- σ2 https://teazrq.github.io/smlr/ridge-regression.html#bias-and-variance-of-ridge-regression 9/21/21, 6:13 pmridge regression and the bias-variance trade-off page 8 of 18https://teazrq.github.io/stat432/rnotes/ridge/ridgereg.html#bias_and_variance_of_ridge_regression of-ridge-regression). we will set up a simulation study with the following steps, with both and : 1. generate a set of observations 2. 
1. Generate a set of $n = 100$ observations
2. Estimate the ridge estimator $\hat{\beta}^{\text{ridge}}$ with $\lambda = 0.3$
3. Repeat steps 1) and 2) for $\text{nsim} = 1000$ runs
4. Average all estimations and compare that with the truth $\beta$
5. Compute the variation of these estimates across all runs
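For reference on what step 4 should show: under a fixed design, the ridge estimator has the well-known conditional expectation (stated here as a one-line sketch; the full derivation is in the SMLR chapter linked above):

$$E\big[\hat{\beta}^{\text{ridge}} \mid X\big] = (X^TX + n\lambda I)^{-1} X^TX \, \beta \neq \beta \quad \text{for } \lambda > 0,$$

so the averaged estimates in step 4 should land below the truth, while step 5 should show a smaller spread than the corresponding OLS estimates.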
Answer (Robert, Sep 26, 2021):
> library(MASS)
> set.seed(2)
>
> # simulation settings: nsim runs of n observations each
> nsim = 1000
> n = 100
> # penalty level; n * lambda is added to the diagonal of X'X below
> lambda = 0.9
>
> # each row will hold the ridge estimate from one run
> allridgebeta = matrix(NA, nsim, 2)
>
> for (i in 1:nsim)
+ {
+   # generate highly correlated predictors; the true coefficients are (1, 1)
+   x = mvrnorm(n, c(0, 0), matrix(c(1, 0.99, 0.99, 1), 2, 2))
+   y = rnorm(n, mean = x[,1] + x[,2])
+   # closed-form ridge solution: (X'X + n * lambda * I)^{-1} X'y
+   allridgebeta[i, ] = solve(t(x) %*% x + lambda * n * diag(2)) %*% t(x) %*% y
+ }
> allridgebeta
[,1] [,2]
[1,] 0.7389524 0.7294337
[2,] 0.7077690 0.7233129
[3,] 0.6920325 0.6835985
[4,] 0.6486345 0.6530556
[5,] 0.6794410 0.6695640
[6,] 0.7051073 0.7004656
[7,] 0.6940947 0.6474362
[8,] 0.7269877 0.7021716
[9,] 0.7116283 0.7273537
[10,] 0.6897933 0.6716877
[11,] 0.6642712 0.6723500
[12,] 0.6862648 0.6720921
[13,] 0.7197306 0.7300141
[14,] 0.7126882 0.6773983
[15,] 0.6848574 0.6933491
[16,] 0.6854437 0.6712318
[17,] 0.6267624 0.6454580
... (remaining rows 18 through 1000 omitted)
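The printout covers steps 1) through 3) of the simulation. A minimal sketch of steps 4) and 5), run after the loop above; nothing is assumed beyond the `allridgebeta` matrix already computed:

```r
# step 4: average all estimates and compare with the truth, beta = (1, 1)
colMeans(allridgebeta)

# step 5: variation of the estimates across all runs
apply(allridgebeta, 2, var)

# re-running the same loop with lambda = 0 would give the OLS estimates,
# whose averages should be closer to 1 but with a larger variance
```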