Suppose the risk measure R is VaR(α) for some α. Let P1 and P2 be two portfolios whose returns have a joint normal distribution with means μ1 and μ2, standard deviations σ1 and σ2, and correlation ρ. Suppose the initial investments are S1 and S2. Show that R(P1 + P2) ≤ R(P1) + R(P2) under joint normality.
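
One standard way to see this, sketched under the usual convention that VaR is taken on the dollar P&L, VaR_α(P) = −E[P&L] + z_α · sd(P&L) with z_α = Φ⁻¹(α), and assuming α ≥ 1/2 so that z_α ≥ 0 (for α < 1/2 the inequality can fail):

\[
\mathrm{VaR}_\alpha(P_i) = -S_i\mu_i + S_i\sigma_i z_\alpha,
\qquad z_\alpha = \Phi^{-1}(\alpha).
\]

The P&L of the combined position is \(S_1 R_1 + S_2 R_2\), again normal, with standard deviation

\[
\sigma_{1+2} = \sqrt{S_1^2\sigma_1^2 + 2\rho\, S_1 S_2 \sigma_1\sigma_2 + S_2^2\sigma_2^2}
\;\le\; \sqrt{(S_1\sigma_1 + S_2\sigma_2)^2} = S_1\sigma_1 + S_2\sigma_2,
\]

since \(\rho \le 1\). Hence

\[
\mathrm{VaR}_\alpha(P_1+P_2) = -(S_1\mu_1 + S_2\mu_2) + \sigma_{1+2}\, z_\alpha
\;\le\; \mathrm{VaR}_\alpha(P_1) + \mathrm{VaR}_\alpha(P_2).
\]

The key step is that ρ ≤ 1 forces the combined standard deviation below the sum of the individual ones; joint normality is what makes VaR a linear function of the mean and standard deviation in the first place.

A quick numerical sanity check of the inequality; the parameter values below are hypothetical, chosen only for illustration:

import numpy as np
from scipy.stats import norm

# Hypothetical example parameters (not from the original question).
alpha = 0.99
z = norm.ppf(alpha)                # z_alpha = Phi^{-1}(alpha) >= 0 since alpha >= 1/2
S1, S2 = 1_000_000, 2_000_000      # initial investments
mu1, mu2 = 0.05, 0.03              # mean returns
s1, s2 = 0.20, 0.15                # return standard deviations
rho = 0.4                          # correlation

# VaR of a normal P&L: VaR_alpha = -S*mu + S*sigma*z_alpha
var1 = -S1 * mu1 + S1 * s1 * z
var2 = -S2 * mu2 + S2 * s2 * z

# Combined portfolio: the variance picks up the cross term 2*rho*S1*S2*s1*s2
sigma12 = np.sqrt((S1 * s1) ** 2 + 2 * rho * S1 * S2 * s1 * s2 + (S2 * s2) ** 2)
var12 = -(S1 * mu1 + S2 * mu2) + sigma12 * z

print(f"VaR(P1) + VaR(P2) = {var1 + var2:,.0f}")
print(f"VaR(P1 + P2)      = {var12:,.0f}")
assert var12 <= var1 + var2       # subadditivity holds because rho <= 1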


