Let A be a 2 x 2 matrix and V_1 and V_2 vectors in R^2. Prove that A(alpha V_1 + beta V_2) = alpha A V_1 + beta A V_2.
Solution
We know that matrix-vector multiplication is distributive over vector addition, that is
A(b + c) = Ab + Ac,
and that it is compatible with scalar multiplication, that is A(k*v) = k*(Av) for any scalar k.
Now let alpha and beta be real scalars, i.e. alpha, beta in R,
and let V1 and V2 be two vectors in R^2, say
V1 = x1 i + y1 j
and V2 = x2 i + y2 j,
and let A be a 2 x 2 matrix. Then the two laws above apply to the expression
A(alpha*V1 + beta*V2). ---------------> (1)
By the distributive law, (1) can be written as A(alpha*V1) + A(beta*V2), and by compatibility with scalar multiplication this equals alpha*AV1 + beta*AV2.
Hence A(alpha*V1 + beta*V2) = alpha*AV1 + beta*AV2.
Both steps are valid because A is a 2 x 2 matrix and the vectors lie in R^2, so every product above is defined.
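
For completeness, the same identity can also be checked entrywise. A minimal sketch in LaTeX, assuming a generic matrix A with entries a, b, c, d (these entry names are illustrative and not part of the original problem):

% requires amsmath for pmatrix
\[
A(\alpha V_1 + \beta V_2)
= \begin{pmatrix} a & b \\ c & d \end{pmatrix}
  \begin{pmatrix} \alpha x_1 + \beta x_2 \\ \alpha y_1 + \beta y_2 \end{pmatrix}
= \begin{pmatrix} a(\alpha x_1 + \beta x_2) + b(\alpha y_1 + \beta y_2) \\
                  c(\alpha x_1 + \beta x_2) + d(\alpha y_1 + \beta y_2) \end{pmatrix}
= \alpha \begin{pmatrix} a x_1 + b y_1 \\ c x_1 + d y_1 \end{pmatrix}
+ \beta  \begin{pmatrix} a x_2 + b y_2 \\ c x_2 + d y_2 \end{pmatrix}
= \alpha A V_1 + \beta A V_2 .
\]

As a quick numerical spot-check (illustrative only, not a proof), one can pick arbitrary values for A, V1, V2, alpha, and beta and compare both sides, for example with NumPy:

import numpy as np

# Arbitrary example values; any 2x2 matrix, vectors in R^2, and real scalars work.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
V1 = np.array([1.0, -2.0])
V2 = np.array([0.5, 3.0])
alpha, beta = 2.0, -1.5

lhs = A @ (alpha * V1 + beta * V2)        # A(alpha*V1 + beta*V2)
rhs = alpha * (A @ V1) + beta * (A @ V2)  # alpha*AV1 + beta*AV2
print(np.allclose(lhs, rhs))              # prints True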
