Homework 8 - Solutions
Exercise 1
Show directly that the kernel of a linear transformation T : R^n → R^m is a subspace of R^n.
Solution
We need to show that ker T contains the zero vector and is closed under addition and scalar multiplication. We have seen that T(0) = 0 (Homework 3, Exercise 4), so that 0 ∈ ker T. Let ~x, ~y ∈ ker T. This means T(~x) = 0 and T(~y) = 0. Then

T(~x + ~y) = T(~x) + T(~y) = 0 + 0 = 0,

and so ~x + ~y is in ker T. This implies that ker T is closed under addition. To show that it is closed under scalar multiplication, let ~x ∈ ker T and c ∈ R. In particular T(~x) = 0. We have

T(c~x) = c T(~x) = c · 0 = 0,

and hence c~x ∈ ker T.
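As a quick numerical sanity check (separate from the proof), the following Python sketch verifies both closure properties for a sample matrix; the matrix and kernel vectors are arbitrary illustrative choices, not part of the exercise.

```python
import numpy as np

# A sample linear transformation T : R^3 -> R^2, given by a matrix A.
# (The specific entries are an arbitrary illustration.)
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])

# Two vectors in ker T: A @ x = 0 and A @ y = 0.
x = np.array([-1.0, -1.0, 1.0])
y = np.array([-2.0, -2.0, 2.0])
assert np.allclose(A @ x, 0) and np.allclose(A @ y, 0)

# Closure under addition and scalar multiplication, as the proof shows.
assert np.allclose(A @ (x + y), 0)
assert np.allclose(A @ (5.0 * x), 0)
```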
Exercise 2
Show directly that the image of a linear transformation T : R^n → R^m is a subspace of R^m.
Solution
We need to show that im T contains the zero vector and is closed under addition and scalar multiplication. We have seen that T(0) = 0 (Homework 3, Exercise 4), so that 0 ∈ im T. Let ~x, ~y ∈ im T. This means ~x = T(~v) and ~y = T(~w) for some ~v, ~w ∈ R^n. Then

~x + ~y = T(~v) + T(~w) = T(~v + ~w),

and so ~x + ~y is in im T. This implies that im T is closed under addition. To show that it is closed under scalar multiplication, let ~x ∈ im T and c ∈ R. In particular ~x = T(~v) for some ~v ∈ R^n. We have

c~x = c T(~v) = T(c~v),

and hence c~x ∈ im T.
Exercise 3
Let T : R^n → R^m be a linear transformation. Show that if ~v, ~u ∈ R^n are two vectors such that T(~v) = T(~u), then we can write ~v as the sum of ~u with some vector from the kernel of T, i.e. there is ~w ∈ ker T such that ~v = ~u + ~w.
Solution
Consider the element ~w = ~v − ~u in R^n. We have

T(~w) = T(~v − ~u) = T(~v) − T(~u) = 0,

since T(~v) = T(~u), and hence ~w is in the kernel of T. Therefore ~v = ~u + ~w with ~w ∈ ker T.
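A small numerical illustration of this decomposition, with an arbitrary sample matrix and vectors:

```python
import numpy as np

# Sample matrix (arbitrary illustration) for T : R^3 -> R^2.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])

u = np.array([1.0, 1.0, 0.0])
w = np.array([-1.0, -1.0, 1.0])   # a kernel vector: A @ w = 0
v = u + w                         # then T(v) = T(u)

assert np.allclose(A @ w, 0)
assert np.allclose(A @ v, A @ u)
# Recovering w as v - u, exactly as in the solution:
assert np.allclose(A @ (v - u), 0)
```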
Exercise 4
Let ~v, ~w ∈ R^n be two vectors. Show that they are linearly dependent if and only if one of the two is a scalar multiple of the other. Give an example of two vectors ~v, ~w such that ~v is a multiple of ~w, but ~w is not a multiple of ~v.
Solution
Note that the first part is the special case of Exercise 7 from Homework 7 for precisely two vectors, i.e. m = 2. Nonetheless, we give the whole proof.

First we show that “if ~v, ~w are linearly dependent, then one of the two is a scalar multiple of the other.” Assume that ~v, ~w are linearly dependent, say

a~v + b~w = 0

is a non-trivial linear relation between them. In particular one of a, b is non-zero. Assume that a ≠ 0 (the case b ≠ 0 is similar). Then

a~v + b~w = 0 ⇒ a~v = −b~w ⇒ ~v = −(b/a) ~w,

so that ~v is a scalar multiple of ~w.

Now we show that “if one of ~v, ~w is a scalar multiple of the other, then they are linearly dependent.” Assume that ~v is a scalar multiple of ~w and write ~v = c~w (the case of ~w being a scalar multiple of ~v is similar). Then

~v = c~w ⇒ ~v − c~w = 0.

This is a relation among ~v, ~w. It is non-trivial as the coefficients 1, −c are not both zero. Hence ~v, ~w are linearly dependent.

For an example of two vectors ~v, ~w such that ~v is a multiple of ~w, but ~w is not a multiple of ~v, let ~v = 0 and let ~w be any non-zero vector in R^n. Then ~v = 0~w, but certainly ~w is not a scalar multiple of the zero vector.
Exercise 5
Let T : R^n → R^m be a linear transformation, and let ~v1, . . . , ~vk be linearly dependent vectors in R^n. Show that the T(~v1), . . . , T(~vk) are linearly dependent vectors in R^m. Give an example to show that if the ~v1, . . . , ~vk are linearly independent, then the T(~v1), . . . , T(~vk) need not be linearly independent.
Solution
Since the ~v1, . . . , ~vk are linearly dependent vectors, there are scalars c1, . . . , ck ∈ R, not all of which are zero, such that

c1~v1 + · · · + ck~vk = 0.

Applying T to both sides of the equation, we see that

T(c1~v1 + · · · + ck~vk) = T(c1~v1) + · · · + T(ck~vk) = c1 T(~v1) + · · · + ck T(~vk)

is equal to T(0) = 0. Therefore there is a relation

c1 T(~v1) + · · · + ck T(~vk) = 0.

It is non-trivial as not all of c1, . . . , ck are zero, and hence T(~v1), . . . , T(~vk) are linearly dependent.
Now assume that the ~v1, . . . , ~vk are linearly independent vectors. Let T : R^n → R^m be the zero linear transformation, i.e. T(~x) = 0 for all ~x ∈ R^n. Then all the T(~vi) are zero, and so the vectors T(~v1), . . . , T(~vk) are linearly dependent; for instance,

T(~v1) + · · · + T(~vk) = 0

is a non-trivial relation among them.
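The zero-map example can be checked numerically; the vectors below are an arbitrary independent pair.

```python
import numpy as np

# Independent vectors in R^3 (arbitrary illustration).
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
assert np.linalg.matrix_rank(np.column_stack([v1, v2])) == 2

# The zero transformation T(x) = 0, given by the zero matrix.
T = np.zeros((2, 3))
images = np.column_stack([T @ v1, T @ v2])

# Both images are 0, so they are dependent: rank < number of vectors.
assert np.linalg.matrix_rank(images) < 2
```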
Exercise 6
Let T : R^n → R^m be a linear transformation such that ker T = {0}, and let ~v1, . . . , ~vk be linearly independent vectors in R^n. Show that the T(~v1), . . . , T(~vk) are linearly independent vectors in R^m.
Solution
Let c1, . . . , ck ∈ R be scalars such that

c1 T(~v1) + · · · + ck T(~vk) = 0.

Then

T(c1~v1 + · · · + ck~vk) = T(c1~v1) + · · · + T(ck~vk) = c1 T(~v1) + · · · + ck T(~vk) = 0,

and hence c1~v1 + · · · + ck~vk is in ker T. Since the kernel is trivial,

c1~v1 + · · · + ck~vk = 0.

This is a relation among the linearly independent vectors ~v1, . . . , ~vk, and so it is trivial, i.e. c1 = · · · = ck = 0. We deduce that the relation

c1 T(~v1) + · · · + ck T(~vk) = 0

is also trivial, and so the vectors T(~v1), . . . , T(~vk) are linearly independent.
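A numerical illustration with an arbitrary full-column-rank matrix, so that its kernel is trivial:

```python
import numpy as np

# T : R^2 -> R^3 with trivial kernel (full column rank); arbitrary example.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])
assert np.linalg.matrix_rank(A) == 2   # ker T = {0}

# Independent vectors in R^2 ...
V = np.column_stack([np.array([1.0, 1.0]), np.array([1.0, -1.0])])
assert np.linalg.matrix_rank(V) == 2

# ... stay independent after applying T.
assert np.linalg.matrix_rank(A @ V) == 2
```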
Exercise 7
Let ~v1, . . . , ~vm ∈ R^n be a set of orthonormal vectors, i.e. the ~vi are unit vectors such that the dot product of any two distinct ones is zero, that is

~vi · ~vj = 0 if i ≠ j, and ~vi · ~vi = 1.

Show that the ~v1, . . . , ~vm are linearly independent.
Solution
Let c1, . . . , cm ∈ R be scalars such that

c1~v1 + · · · + cm~vm = 0.

Taking the dot product of both sides of the above equation with ~vi for some i, we see that

~vi · (c1~v1 + · · · + cm~vm) = c1 (~vi · ~v1) + · · · + cm (~vi · ~vm) = c1 · 0 + · · · + ci−1 · 0 + ci · 1 + ci+1 · 0 + · · · + cm · 0 = ci

is equal to ~vi · 0 = 0. This is true for any i, so that c1 = · · · = cm = 0. Hence any relation among ~v1, . . . , ~vm is trivial and so they are linearly independent.
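As a sanity check, one can generate orthonormal vectors (here via a QR factorization of a random matrix, an arbitrary choice) and confirm that they are independent:

```python
import numpy as np

# Orthonormal columns from the QR factorization of a random 5x3 matrix.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 3)))

# Orthonormality: Q^T Q = I, i.e. v_i . v_j is 0 for i != j and 1 for i = j.
assert np.allclose(Q.T @ Q, np.eye(3))

# Independence: the matrix of the vectors has full column rank.
assert np.linalg.matrix_rank(Q) == 3
```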
Exercise 8
Let V and W be two subsets of R^n (not necessarily vector subspaces). The intersection of V and W, denoted by V ∩ W, is the subset of R^n consisting of all vectors in R^n that belong to both V and W, i.e.

V ∩ W = {~v ∈ R^n : ~v ∈ V and ~v ∈ W}.
Show that if V, W are vector subspaces, then so is V ∩ W .
Solution
We need to show that V ∩ W contains the zero vector and is closed under addition and scalar multiplication. The zero vector is in both V and W (as they are subspaces), and so it is in V ∩ W too. Let ~v, ~w ∈ V ∩ W; we want to show that ~v + ~w ∈ V ∩ W. Then ~v is in both V and W, and the same holds for ~w. Since both ~v, ~w are in the subspace V, so is ~v + ~w. Similarly ~v + ~w is in W. We see that ~v + ~w is a vector in both V and W, and hence it is in V ∩ W. This shows that V ∩ W is closed under addition.

Now let ~v ∈ V ∩ W and c ∈ R. We want to show c~v ∈ V ∩ W. The vector ~v is in both V and W. Further, c~v is also in V, as V is a subspace, and in W for the same reason. From this we see that c~v is in V ∩ W. This shows that V ∩ W is closed under scalar multiplication. We conclude that V ∩ W is a subspace of R^n.
Exercise 9
Let V and W be two subsets of R^n (not necessarily vector subspaces). The union of V and W, denoted by V ∪ W, is the subset of R^n consisting of all vectors in R^n that belong to at least one of V and W, i.e.

V ∪ W = {~v ∈ R^n : ~v ∈ V or ~v ∈ W (or both)}.
Show that if V, W are vector subspaces, V ∪ W need not be a vector subspace.
Solution
Consider the subspaces of R^2

V = Span(~e1) = { [ x 0 ]^t : x ∈ R }   and   W = Span(~e2) = { [ 0 y ]^t : y ∈ R }.

We claim that V ∪ W is not a subspace of R^2. The vector ~e1 is in V ∪ W, since it is in V. Similarly, the vector ~e2 is in V ∪ W, since it is in W. Their sum ~e1 + ~e2 = [ 1 1 ]^t is not in V (not of the form [ x 0 ]^t for some x ∈ R), and not in W (not of the form [ 0 y ]^t for some y ∈ R), hence not in V ∪ W.
Therefore V ∪ W is not closed under addition and so not a vector space.
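The counterexample can be phrased computationally; the membership tests below simply encode the descriptions of V and W:

```python
import numpy as np

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

def in_V(u):  # V = Span(e1): second coordinate zero
    return np.isclose(u[1], 0.0)

def in_W(u):  # W = Span(e2): first coordinate zero
    return np.isclose(u[0], 0.0)

s = e1 + e2                            # the sum [1, 1]^t from the solution
assert in_V(e1) and in_W(e2)           # both summands lie in V ∪ W
assert not in_V(s) and not in_W(s)     # their sum is in neither, so not in V ∪ W
```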
Exercise 10
Let V and W be two subspaces of R^n. The sum of V and W, denoted by V + W, is the subset of R^n consisting of all vectors in R^n of the form ~v + ~w with ~v ∈ V and ~w ∈ W, i.e.

V + W = {~v + ~w : ~v ∈ V, ~w ∈ W}.
Show that V + W is a vector subspace of Rn .
Solution
We need to show that V + W contains the zero vector and is closed under addition and scalar multiplication. The zero vector is in both V and W (as they are subspaces), and so 0 = 0 + 0 is in V + W. Let ~u1, ~u2 ∈ V + W; we want to show that ~u1 + ~u2 ∈ V + W. Since ~u1, ~u2 are in V + W, we can write them as

~u1 = ~v1 + ~w1   and   ~u2 = ~v2 + ~w2,   for some ~v1, ~v2 ∈ V and ~w1, ~w2 ∈ W.

Then ~v1 + ~v2 ∈ V and ~w1 + ~w2 ∈ W, as V, W are subspaces, so that

~u1 + ~u2 = ~v1 + ~w1 + ~v2 + ~w2 = (~v1 + ~v2) + (~w1 + ~w2)

is in V + W. This shows that V + W is closed under addition.

Now let ~u ∈ V + W and c ∈ R. We want to show c~u ∈ V + W. Since ~u ∈ V + W, we can write ~u = ~v + ~w for some ~v ∈ V and ~w ∈ W. Then c~v ∈ V and c~w ∈ W, as V, W are subspaces, so that

c~u = c(~v + ~w) = (c~v) + (c~w)

is in V + W. This shows that V + W is closed under scalar multiplication. We conclude that V + W
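A small numerical sketch of V + W as the column span of the two bases placed side by side; the subspaces chosen are arbitrary:

```python
import numpy as np

# Bases of two subspaces of R^3 (arbitrary illustration).
V = np.array([[1.0], [0.0], [0.0]])   # V = Span(e1)
W = np.array([[0.0], [1.0], [0.0]])   # W = Span(e2)

# V + W is spanned by the columns of both bases side by side.
S = np.hstack([V, W])
assert np.linalg.matrix_rank(S) == 2  # dim(V + W) = 2 here

# A sample sum v + w lies in the column space of S:
v, w = V[:, 0], 3.0 * W[:, 0]
x, *_ = np.linalg.lstsq(S, v + w, rcond=None)
assert np.allclose(S @ x, v + w)
```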
Exercise 11
Let ~v1, . . . , ~vm ∈ R^n. Show that Span(~v1, . . . , ~vm) is the smallest subspace of R^n containing ~v1, . . . , ~vm. In other words, show that if V is a subspace of R^n that contains all of ~v1, . . . , ~vm, then it contains Span(~v1, . . . , ~vm).
Solution
Let V be a subspace of R^n that contains all of ~v1, . . . , ~vm. Let ~v ∈ Span(~v1, . . . , ~vm). In particular, ~v is a linear combination of the ~v1, . . . , ~vm, say

~v = c1~v1 + · · · + cm~vm   for some c1, . . . , cm ∈ R.

Each ~vi is in V, and since V is closed under scalar multiplication, every ci~vi is also in V. Furthermore, since V is closed under addition, the sum

(c1~v1) + · · · + (cm~vm) = c1~v1 + · · · + cm~vm = ~v

is also in V. We see that every vector in Span(~v1, . . . , ~vm) is also in V, and so V contains Span(~v1, . . . , ~vm).
Exercise 12
Let ~v1, . . . , ~vm ∈ R^n be a set of vectors that are linearly independent, and let ~v be an element that is not in Span(~v1, . . . , ~vm). Show that the ~v1, . . . , ~vm, ~v are linearly independent.
Solution
We need to show that any relation among ~v1, . . . , ~vm, ~v, say

c1~v1 + · · · + cm~vm + c~v = 0   for some c1, . . . , cm, c ∈ R,

is trivial. We consider two cases: c = 0 and c ≠ 0.

If c ≠ 0, then we can write

c1~v1 + · · · + cm~vm + c~v = 0 ⇒ −c~v = c1~v1 + · · · + cm~vm ⇒ ~v = −(c1/c) ~v1 − · · · − (cm/c) ~vm.
This means that ~v is a linear combination of ~v1, . . . , ~vm, and so an element of Span(~v1, . . . , ~vm). But we know that ~v is not in Span(~v1, . . . , ~vm), so that the case c ≠ 0 is not possible, hence c = 0. But then the relation among ~v1, . . . , ~vm, ~v becomes

c1~v1 + · · · + cm~vm = 0,

which is a relation among ~v1, . . . , ~vm. Since these are linearly independent, all c1, . . . , cm are zero. Hence all coefficients in the original relation among ~v1, . . . , ~vm, ~v are zero, and so the relation is trivial. We deduce that ~v1, . . . , ~vm, ~v are linearly independent.
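Numerically, appending a vector outside the span raises the rank by one, which is exactly the independence statement; the vectors below are arbitrary:

```python
import numpy as np

# Independent vectors in R^3 (arbitrary illustration).
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])

# A vector outside Span(v1, v2): its third coordinate is non-zero.
v = np.array([1.0, 2.0, 3.0])

M = np.column_stack([v1, v2])
assert np.linalg.matrix_rank(M) == 2
assert np.linalg.matrix_rank(np.column_stack([M, v])) == 3  # still independent
```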
Exercise 13
Let V and W be subspaces of R^n such that V is contained in W, i.e. all vectors in V are also in W. Let {~v1, . . . , ~vk} be a basis for V. Show that there are vectors ~vk+1, . . . , ~vm such that the ~v1, . . . , ~vm form a basis for W.
Solution
If Span(~v1, . . . , ~vk) = W, then {~v1, . . . , ~vk} is a basis for W and we are done (in which case m = k). If not, i.e. if Span(~v1, . . . , ~vk) is not all of W, then there is some ~vk+1 ∈ W not in Span(~v1, . . . , ~vk). The vectors ~v1, . . . , ~vk are linearly independent since they form a basis for V, and ~vk+1 ∉ Span(~v1, . . . , ~vk), so that by Exercise 12 the vectors ~v1, . . . , ~vk, ~vk+1 are linearly independent. If Span(~v1, . . . , ~vk+1) = W, then {~v1, . . . , ~vk+1} is a basis for W and we are done (in which case m = k + 1). Otherwise repeat the same process to get ~vk+2 with ~v1, . . . , ~vk+2 linearly independent, and so on.

We show that this process terminates after a finite number of steps. We know that in any vector subspace, the number of elements of any set of linearly independent vectors is at most the dimension of the subspace. Therefore, there cannot be more than dim W linearly independent vectors in W; and dim W is a number less than or equal to n, as W is a subspace of R^n. When the above process produces the element ~vk+l ∈ W such that k + l = dim W, the set {~v1, . . . , ~vk+l} will be a set of k + l = dim W linearly independent vectors in W, and so a basis for W.
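The process in this proof is effectively an algorithm. The following Python sketch implements it greedily; the function name extend_basis and the choice of drawing candidate vectors from a known basis of W are illustrative assumptions, not part of the exercise.

```python
import numpy as np

def extend_basis(V_basis, W_basis):
    """Extend a basis of V to a basis of W containing V, following the
    greedy process in the solution (a sketch, not the unique answer)."""
    basis = [np.asarray(v, dtype=float) for v in V_basis]
    for w in W_basis:
        candidate = np.column_stack(basis + [np.asarray(w, dtype=float)])
        # Keep w only if it is not already in the span of what we have
        # (i.e. adding it raises the rank, as in Exercise 12).
        if np.linalg.matrix_rank(candidate) > len(basis):
            basis.append(np.asarray(w, dtype=float))
    return basis

# Example: V = Span(e1) inside W = R^3.
V_basis = [[1.0, 0.0, 0.0]]
W_basis = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
B = extend_basis(V_basis, W_basis)
assert np.linalg.matrix_rank(np.column_stack(B)) == 3   # a basis of W
```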
Exercise 14
Let V and W be subspaces of R^n such that V is contained in W. Show that dim V ≤ dim W, and that if equality holds, then V = W.
Solution
Let {~v1, . . . , ~vk} be a basis for V, and let ~vk+1, . . . , ~vm ∈ W be such that {~v1, . . . , ~vm} is a basis for W (see Exercise 13). Then dim V = k and dim W = m. Clearly k ≤ m, and hence dim V ≤ dim W. If equality holds, i.e. if k = m, then {~v1, . . . , ~vk} is a basis for both V and W. In this case

V = Span(~v1, . . . , ~vk) = W.
Exercise 15
Let V be a subspace of R^3 that is not all of R^3. Assume that there are two linearly independent vectors ~v, ~w in V. Show that dim V = 2.
Solution
The number of elements in a set of linearly independent vectors in V cannot exceed dim V. Since there are two linearly independent vectors in V, the dimension of V must be at least 2. The dimension of V cannot be more than 3 since it is a subspace of R^3. Thus dim V is 2 or 3. If dim V = 3, then V = R^3, which is not the case, hence dim V = 2.
Exercise 16
Let T : R^n → R^m and S : R^k → R^n be linear transformations such that the composition T ◦ S is the zero function, i.e. (T ◦ S)(~x) = 0 for all ~x ∈ R^k. Show that the image of S is contained in the kernel of T.
Solution
Let ~y be an element in the image of S, i.e. ~y = S(~x) for some ~x ∈ R^k. Then

T(~y) = T(S(~x)) = (T ◦ S)(~x) = 0,

as T ◦ S is the zero function, hence ~y ∈ ker T. We deduce that im S is contained in ker T.
Exercise 17
Let A ∈ M_{m×n}(R) and B ∈ M_{n×k}(R). Show that the nullspace of AB contains the nullspace of B. In particular, the nullity of AB is at least the nullity of B.
Solution
Let ~x ∈ R^k be an element in the nullspace of B. Then

(AB)~x = A(B~x) = A0 = 0,

so that ~x is in the nullspace of AB. We conclude that the nullspace of AB contains the nullspace of B. The nullity of a matrix is the dimension of its nullspace, so it follows from Exercise 14 that the nullity of AB is at least the nullity of B.
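A concrete check with small sample matrices (arbitrary choices), computing nullity as k − rank:

```python
import numpy as np

# Arbitrary sample matrices: A is 3x2, B is 2x3 (so AB is 3x3).
A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
B = np.array([[1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])

x = np.array([-1.0, -1.0, 1.0])      # a vector in the nullspace of B
assert np.allclose(B @ x, 0)
assert np.allclose((A @ B) @ x, 0)   # it lies in the nullspace of AB too

k = B.shape[1]
null_B  = k - np.linalg.matrix_rank(B)       # nullity = k - rank (Rank-Nullity)
null_AB = k - np.linalg.matrix_rank(A @ B)
assert null_AB >= null_B
```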
Exercise 18
Let A ∈ M_{m×n}(R) and B ∈ M_{n×k}(R). Show that the rank of AB is at most the rank of each of A and B.
Solution
From the Rank-Nullity Theorem, we know that

null B + rank B = k,   or equivalently   rank B = k − null B.

Similarly rank(AB) = k − null(AB). From Exercise 17 we know that null(AB) ≥ null B, so that rank(AB) ≤ rank B.

To show rank(AB) ≤ rank A, we show that the image of A contains the image of AB. Indeed, if ~y ∈ R^m is in im(AB), i.e. ~y = (AB)~x for some ~x ∈ R^k, then

~y = (AB)~x = A(B~x),

so that ~y is in the image of A too. Therefore im A contains im(AB). The rank of a matrix is equal to the dimension of its image, so it follows from Exercise 14 that rank(AB) ≤ rank A.
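Both inequalities can be observed numerically on random sample matrices:

```python
import numpy as np

# Random sample matrices (arbitrary illustration of the inequalities).
rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((3, 5))

r_A, r_B, r_AB = (np.linalg.matrix_rank(M) for M in (A, B, A @ B))
assert r_AB <= r_A and r_AB <= r_B
```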
Exercise 19
Show that the vectors ~v1, . . . , ~vm ∈ R^n are linearly independent if and only if the n × m matrix

A = [ ~v1 · · · ~vm ]   (the matrix with the ~vi as its columns)

has rank m.
Solution
First assume that the ~v1, . . . , ~vm are linearly independent. The image of A is Span(~v1, . . . , ~vm). Since the ~v1, . . . , ~vm are linearly independent and im A = Span(~v1, . . . , ~vm), the set of vectors {~v1, . . . , ~vm} is a basis for im A. In particular the dimension of the image of A is m. But the dimension of the image of a matrix is equal to its rank, hence rank A = m.

Conversely, assume that A has rank m. Its image is Span(~v1, . . . , ~vm). The rank of A is equal to the dimension of its image, hence dim(im A) = m. The vectors ~v1, . . . , ~vm are m vectors that span im A (which is of dimension m), hence they form a basis for it. In particular ~v1, . . . , ~vm are linearly independent.
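The rank criterion is easy to test numerically; the two matrices below are arbitrary examples of an independent and a dependent set of columns:

```python
import numpy as np

# Columns are the vectors v_1, ..., v_m (arbitrary illustration, n = 3, m = 2).
A_indep = np.column_stack([[1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])
A_dep   = np.column_stack([[1.0, 0.0, 1.0], [2.0, 0.0, 2.0]])  # 2nd = 2 * 1st

assert np.linalg.matrix_rank(A_indep) == 2   # rank m   <=> independent
assert np.linalg.matrix_rank(A_dep) < 2      # rank < m <=> dependent
```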
Exercise 20
Let T : R^7 → R^n and S : R^n → R^5 be linear transformations. Find n, given that

dim(ker T) = 3,   im T = ker S,   and   im S = R^5.
Solution
From the Rank-Nullity Theorem for the linear transformation T we have

dim(ker T) + dim(im T) = dim R^7 ⇒ 3 + dim(im T) = 7 ⇒ dim(im T) = 4.

Since the kernel of S and the image of T are equal, their dimensions are the same, i.e. dim(ker S) = dim(im T) = 4. From the Rank-Nullity Theorem for S we get

dim(ker S) + dim(im S) = dim R^n ⇒ 4 + 5 = n ⇒ n = 9.
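One can confirm n = 9 by exhibiting concrete matrices with the required properties; the matrices below are one possible choice, built only for verification:

```python
import numpy as np

# T : R^7 -> R^9 sends e1..e4 to e1..e4 and e5..e7 to 0, so dim ker T = 3.
T = np.zeros((9, 7))
T[:4, :4] = np.eye(4)

# S : R^9 -> R^5 kills e1..e4 (= im T) and maps e5..e9 onto R^5.
S = np.zeros((5, 9))
S[:, 4:] = np.eye(5)

assert 7 - np.linalg.matrix_rank(T) == 3   # dim ker T = 3
assert np.linalg.matrix_rank(S @ T) == 0   # im T is contained in ker S
assert np.linalg.matrix_rank(S) == 5       # im S = R^5
# Rank-Nullity: dim ker S = 9 - 5 = 4 = dim im T, consistent with n = 9.
```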