Investment 1 has average returns of $100 and a standard deviation of $20. Investment 2 has average returns of $100 and a standard deviation of $40. The correlation coefficient between the two investments equals -1/2. If you split $1 between both investments, how should you distribute the amounts to minimize the variance of the returns?
Solution
Suppose we invest $x in investment 1 and $(1 - x) in investment 2.
Total variance = 400x^2 + 1600(1-x)^2 + 2(20)(40)(-1/2)x(1-x) = 400x^2 + 1600(1-x)^2 - 800x(1-x).
 
d(total variance)/dx = 800x - 3200(1-x) - 800(1-2x) = 5600x - 4000 = 0, so x = 5/7.
The second derivative = 5600 > 0, so this is indeed the minimum variance.
So x = 5/7 and 1 - x = 2/7: invest $5/7 in investment 1 and $2/7 in investment 2.
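As a sanity check, the calculation above can be verified numerically. The sketch below (function and variable names are my own) evaluates the portfolio variance on a fine grid over x in [0, 1] and confirms that the minimizer agrees with the analytic answer x = 5/7:

```python
import numpy as np

def portfolio_variance(x, sd1=20.0, sd2=40.0, rho=-0.5):
    # Variance of returns when $x goes to investment 1 and $(1-x) to
    # investment 2: sd1^2 x^2 + sd2^2 (1-x)^2 + 2 rho sd1 sd2 x(1-x).
    return (sd1**2) * x**2 + (sd2**2) * (1 - x)**2 + 2 * rho * sd1 * sd2 * x * (1 - x)

# Grid search over the allocation x.
xs = np.linspace(0.0, 1.0, 1_000_001)
x_star = xs[np.argmin(portfolio_variance(xs))]

print(x_star)                    # close to 5/7 ≈ 0.714286
print(portfolio_variance(5/7))   # minimum variance, 8400/49 ≈ 171.43
```

The minimum variance itself comes out to 400(5/7)^2 + 1600(2/7)^2 - 800(5/7)(2/7) = 8400/49, well below the $400 variance of holding investment 1 alone.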

