a) Prove that if v1, . . . , vk is a linearly dependent list of vectors in R^n, and T: R^n → R^m is a linear transformation, then T(v1), . . . , T(vk) is also linearly dependent.
b) Prove that if S: R^n → R^m and T: R^m → R^p are linear transformations, then the composite transformation T∘S (defined by T∘S(x) = T(S(x))) is also a linear transformation. Prove that if T∘S is one-to-one, then S is one-to-one as well. Prove that if T and S are both one-to-one, then T∘S is one-to-one.
c) Prove that if B is an m × n matrix with linearly independent columns and A is a p × m matrix with linearly independent columns, then the columns of AB are also linearly independent. Prove that if B is m × n, A is p × m, and AB has linearly independent columns, then the columns of B are also linearly independent. How is this related to part b)?
Solution
a) Suppose v1, v2, . . . , vk are linearly dependent. Then there exist scalars c1, c2, . . . , ck, not all zero, such that
c1 v1 + c2 v2 + · · · + ck vk = 0.
Applying T and using linearity,
c1 T(v1) + c2 T(v2) + · · · + ck T(vk) = T(c1 v1 + c2 v2 + · · · + ck vk) = T(0) = 0.
Since the ci are not all zero, this is a nontrivial linear combination of T(v1), . . . , T(vk) equal to 0.
So T(v1), . . . , T(vk) are linearly dependent.
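As a quick numerical sanity check of part a) (a sketch; the specific vectors and the matrix for T below are arbitrary examples, not part of the problem):

```python
import numpy as np

# A dependent list in R^3: v3 = v1 + v2, so 1*v1 + 1*v2 - 1*v3 = 0.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + v2

# An arbitrary linear map T: R^3 -> R^2, represented by a matrix.
T = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])

# The same nontrivial combination annihilates the images:
# 1*T(v1) + 1*T(v2) - 1*T(v3) = T(v1 + v2 - v3) = T(0) = 0.
residual = T @ v1 + T @ v2 - T @ v3
print(np.allclose(residual, 0))  # True
```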
------------------------------------------------------------------
b) Given that S: R^n → R^m and T: R^m → R^p are linear transformations. For any x, y in R^n:
T∘S(x + y) = T(S(x + y)) = T(S(x) + S(y))   (since S is linear)
           = T(S(x)) + T(S(y))              (since T is linear)
           = T∘S(x) + T∘S(y),
and for any scalar c:
T∘S(cx) = T(S(cx)) = T(cS(x)) = cT(S(x)) = c·T∘S(x).
It follows that T∘S is also linear.
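A numerical illustration of the linearity of the composite (a sketch with randomly generated matrices standing in for S and T; the dimensions are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 3))  # matrix of S: R^3 -> R^4
A = rng.standard_normal((5, 4))  # matrix of T: R^4 -> R^5
x = rng.standard_normal(3)
y = rng.standard_normal(3)

# Check additivity and homogeneity of the composite x -> A @ (B @ x)
# in one combined identity: T(S(2x + y)) = 2*T(S(x)) + T(S(y)).
lhs = A @ (B @ (2.0 * x + y))
rhs = 2.0 * (A @ (B @ x)) + A @ (B @ y)
print(np.allclose(lhs, rhs))  # True
```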
-------
Suppose T∘S is one-to-one, and suppose S(x) = S(y). Applying T gives T(S(x)) = T(S(y)), i.e. T∘S(x) = T∘S(y). Since T∘S is one-to-one, x = y. Hence S is also one-to-one.
--------------------------------------
Now suppose T and S are both one-to-one, and suppose T∘S(x) = T∘S(y), i.e. T(S(x)) = T(S(y)). Since T is one-to-one, S(x) = S(y); since S is one-to-one, x = y. Hence T∘S is one-to-one.
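This can be checked numerically using the fact that a matrix map x ↦ Mx is one-to-one exactly when rank(M) equals the number of columns of M (a sketch; the matrices below are arbitrary examples with full column rank):

```python
import numpy as np

# S: R^2 -> R^3 and T: R^3 -> R^4, both one-to-one
# (each matrix has rank equal to its number of columns).
S = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
T = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
C = T @ S  # matrix of the composite T∘S: R^2 -> R^4

print(np.linalg.matrix_rank(S),
      np.linalg.matrix_rank(T),
      np.linalg.matrix_rank(C))  # 2 3 2 -> the composite is one-to-one
```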
---------------
c) Let A be a p × m matrix with linearly independent columns and B an m × n matrix with linearly independent columns. A matrix M has linearly independent columns exactly when the only solution of Mx = 0 is x = 0.
Suppose (AB)x = 0 for some x in R^n. Then A(Bx) = 0, and since the columns of A are linearly independent, Bx = 0. Since the columns of B are linearly independent, x = 0.
So it follows that the columns of AB are linearly independent.
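A concrete instance of the forward direction (a sketch; the matrices are arbitrary examples, using the rank criterion for linearly independent columns):

```python
import numpy as np

# B: 3x2 with linearly independent columns,
# A: 4x3 with linearly independent columns.
B = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 2.0, 3.0]])

# A matrix has linearly independent columns iff its rank equals
# its number of columns.
print(np.linalg.matrix_rank(B))      # 2
print(np.linalg.matrix_rank(A))      # 3
print(np.linalg.matrix_rank(A @ B))  # 2 == number of columns of AB
```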
----------------------------------------------------------
Conversely, suppose AB has linearly independent columns. If Bx = 0, then (AB)x = A(Bx) = A0 = 0, so x = 0 since the columns of AB are linearly independent.
Hence the columns of B are linearly independent.
This is part b) in matrix form: a matrix M has linearly independent columns exactly when the transformation x ↦ Mx is one-to-one, and AB is the matrix of the composite T∘S where S is x ↦ Bx and T is y ↦ Ay. The first statement is "T and S one-to-one implies T∘S one-to-one", and the converse is "T∘S one-to-one implies S one-to-one".
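The converse can be illustrated through its contrapositive: if the columns of B are dependent, so are the columns of AB (a sketch; the matrices are arbitrary examples):

```python
import numpy as np

# B's second column is twice its first, so B's columns are dependent.
B = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 1.0, 0.0],
              [1.0, 1.0, 1.0]])

# The same dependence carries over to AB: rank stays below the
# number of columns, so AB's columns are dependent too.
print(np.linalg.matrix_rank(B))      # 1
print(np.linalg.matrix_rank(A @ B))  # 1 < 2 columns -> dependent
```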

