Suppose the risk measure R is VaR(α) for some α. Let P1 and P2 be two portfolios whose returns have a joint normal distribution with means μ1 and μ2, standard deviations σ1 and σ2, and correlation ρ. Suppose the initial investments are S1 and S2. Show that R(P1 + P2) ≤ R(P1) + R(P2) under joint normality.
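One standard route, sketched under the usual parametric normal-VaR convention VaR_α(P) = -S(μ + Φ^{-1}(α)σ) with α ≤ 1/2 so that -Φ^{-1}(α) ≥ 0; this convention and the restriction on α are assumptions, not stated in the exercise:

\[
\sigma_{P_1+P_2} \;=\; \sqrt{S_1^2\sigma_1^2 + 2\rho\, S_1 S_2 \sigma_1\sigma_2 + S_2^2\sigma_2^2}
\;\le\; S_1\sigma_1 + S_2\sigma_2 \qquad (\text{since } \rho \le 1),
\]
\[
\mathrm{VaR}_\alpha(P_1+P_2) \;=\; -(S_1\mu_1 + S_2\mu_2) - \Phi^{-1}(\alpha)\,\sigma_{P_1+P_2}
\;\le\; -(S_1\mu_1 + S_2\mu_2) - \Phi^{-1}(\alpha)\,(S_1\sigma_1 + S_2\sigma_2)
\;=\; \mathrm{VaR}_\alpha(P_1) + \mathrm{VaR}_\alpha(P_2).
\]

The key step is that the dollar standard deviation of the combined position is maximized at ρ = 1, where it equals the sum of the individual dollar standard deviations.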