Homework 7 - Solutions

Exercise 1 (1 pt.)
Consider the transformation T : R4 → R4 given by the matrix

        [ −1  1  3 −3 ]
    A = [  0 −2  0  4 ] .
        [  1  1 −3 −1 ]
        [  2  2 −6 −2 ]

Find bases for the kernel and the image of T .
Solution
First we calculate the reduced row echelon form of A.

    R1 → R1 /(−1), R2 → R2 /(−2):

        [ 1 −1 −3  3 ]
        [ 0  1  0 −2 ]
        [ 1  1 −3 −1 ]
        [ 2  2 −6 −2 ]

    R3 → R3 − R1 , R4 → R4 − 2R1 :

        [ 1 −1 −3  3 ]
        [ 0  1  0 −2 ]
        [ 0  2  0 −4 ]
        [ 0  4  0 −8 ]

    R1 → R1 + R2 , R3 → R3 − 2R2 , R4 → R4 − 4R2 :

        [ 1  0 −3  1 ]
        [ 0  1  0 −2 ]
        [ 0  0  0  0 ]
        [ 0  0  0  0 ]
The kernel of T consists of the vectors ~x = [ x1 x2 x3 x4 ]t that are solutions to the system A~x = 0.
The augmented matrix of this system and its reduced row echelon form are the following:




    [ −1  1  3 −3 | 0 ]        [ 1 0 −3  1 | 0 ]
    [  0 −2  0  4 | 0 ]        [ 0 1  0 −2 | 0 ]
    [  1  1 −3 −1 | 0 ]   ,    [ 0 0  0  0 | 0 ]
    [  2  2 −6 −2 | 0 ]        [ 0 0  0  0 | 0 ]
The free variables are x3 and x4 ; let x3 = s and x4 = t. Then the general solution of the system is
  

 


    ~x = [ x1 ]   [ 3s − t ]     [ 3 ]     [ −1 ]
         [ x2 ] = [   2t   ] = s [ 0 ] + t [  2 ] .
         [ x3 ]   [   s    ]     [ 1 ]     [  0 ]
         [ x4 ]   [   t    ]     [ 0 ]     [  1 ]
The vectors ~v1 = [ 3 0 1 0 ]t and ~v2 = [ −1 2 0 1 ]t form a basis for the kernel of T .
The image of T is spanned by the column vectors of A. Since the variables x3 and x4 of the
system A~x = 0 are free, the third and fourth column vectors of A are redundant. Hence
the first two column vectors, namely ~w1 = [ −1 0 1 2 ]t and ~w2 = [ 1 −2 1 2 ]t , form a basis for the
image of T .
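These bases can be double-checked mechanically. The sketch below uses SymPy (assumed available; any exact linear-algebra tool would do) to recompute the RREF, kernel, and image of A.

```python
from sympy import Matrix

# The matrix of T from Exercise 1.
A = Matrix([
    [-1,  1,  3, -3],
    [ 0, -2,  0,  4],
    [ 1,  1, -3, -1],
    [ 2,  2, -6, -2],
])

# RREF and pivot columns: pivots in columns 0 and 1 mean the
# first two columns of A span the image.
R, pivots = A.rref()
print(pivots)  # (0, 1)

# A basis for the kernel, matching ~v1, ~v2 above.
kernel_basis = A.nullspace()
print([list(v) for v in kernel_basis])  # [[3, 0, 1, 0], [-1, 2, 0, 1]]

# A basis for the image: the pivot columns of A.
image_basis = [A.col(j) for j in pivots]
```

SymPy returns the same kernel basis as the hand computation because both read the free variables off the same RREF.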
Exercise 2 (1 pt.)
Find whether the following vectors are linearly independent:

    ~v1 = [  4 ] ,   ~v2 = [ 3 ] ,   ~v3 = [  1 ]
          [ −1 ]           [ 8 ]           [  2 ]
          [  3 ]           [ 1 ]           [ −1 ]

If they are linearly dependent, write explicitly a linear relation among them.
Solution
We want to find if there are numbers c1 , c2 , c3 ∈ R, not all zero, such that

    c1~v1 + c2~v2 + c3~v3 = 0.

This is equivalent to

       [  4 ]      [ 3 ]      [  1 ]   [ 0 ]
    c1 [ −1 ] + c2 [ 8 ] + c3 [  2 ] = [ 0 ] ,
       [  3 ]      [ 1 ]      [ −1 ]   [ 0 ]

that is, to the linear system

    4c1 + 3c2 + c3 = 0
    −c1 + 8c2 + 2c3 = 0
    3c1 + c2 − c3 = 0.

The augmented matrix of this system is

    [  4 3  1 | 0 ]
    [ −1 8  2 | 0 ]
    [  3 1 −1 | 0 ]
We bring it to reduced row-echelon form as follows.

    R1 → R1 − R3 :

        [  1 2  2 | 0 ]
        [ −1 8  2 | 0 ]
        [  3 1 −1 | 0 ]

    R2 → R2 + R1 , R3 → R3 − 3R1 :

        [ 1  2  2 | 0 ]
        [ 0 10  4 | 0 ]
        [ 0 −5 −7 | 0 ]

    R2 → R2 /10:

        [ 1  2   2  | 0 ]
        [ 0  1  2/5 | 0 ]
        [ 0 −5  −7  | 0 ]

    R1 → R1 − 2R2 , R3 → R3 + 5R2 :

        [ 1 0 6/5 | 0 ]
        [ 0 1 2/5 | 0 ]
        [ 0 0 −5  | 0 ]

    R3 → R3 /(−5):

        [ 1 0 6/5 | 0 ]
        [ 0 1 2/5 | 0 ]
        [ 0 0  1  | 0 ]

    R1 → R1 − (6/5)R3 , R2 → R2 − (2/5)R3 :

        [ 1 0 0 | 0 ]
        [ 0 1 0 | 0 ]
        [ 0 0 1 | 0 ]
The only solution to the system is c1 = c2 = c3 = 0. Therefore the vectors ~v1 , ~v2 , ~v3 are linearly
independent.
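The conclusion can be verified with a quick rank computation; a minimal SymPy sketch (SymPy assumed available):

```python
from sympy import Matrix

# Columns are ~v1, ~v2, ~v3 from Exercise 2.
V = Matrix([
    [ 4, 3,  1],
    [-1, 8,  2],
    [ 3, 1, -1],
])

# V*c = 0 has only the trivial solution exactly when the columns
# are independent, i.e. when the rank equals the number of columns.
print(V.rank())       # 3
print(V.nullspace())  # []
```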
Exercise 3 (1 pt.)
Find which of the following subsets of R3 are vector subspaces.
(i) The set V = { [ x1 x2 x3 ]t ∈ R3 : x1 + x2 + x3 = 1 }.
(ii) The set V = { [ x1 x2 x3 ]t ∈ R3 : x1 ≤ x2 ≤ x3 }.
(iii) The set V = { [ x1 x2 x3 ]t ∈ R3 : ~v · ~x = ~u · ~x }, where ~v , ~u are two fixed vectors in R3 .
Solution
A subset of R3 is a linear subspace if it contains the zero vector and it is closed under vector
addition and scalar multiplication (Definition 3.2.1).
(i) This set is not a linear subspace of R3 . It does not contain the zero vector [ 0 0 0 ]t since
0 + 0 + 0 ≠ 1.
(ii) This set is not a linear subspace of R3 . It is not closed under scalar multiplication. The vector
~x = [ 1 2 3 ]t is in V since 1 ≤ 2 ≤ 3, but (−1)~x = [ −1 −2 −3 ]t is not, since −1 ≤ −2 ≤ −3
is certainly false.
(iii) This set is a linear subspace of R3 . It contains the zero vector since ~v · 0 and ~u · 0 are both
equal to zero. If ~x, ~y ∈ V , so that ~v · ~x = ~u · ~x and ~v · ~y = ~u · ~y , then
~v · (~x + ~y ) = ~v · ~x + ~v · ~y = ~u · ~x + ~u · ~y = ~u · (~x + ~y )
and hence ~x + ~y is in V . So V is closed under vector addition. Now if ~x ∈ V and c ∈ R, so
that ~v · ~x = ~u · ~x, then
~v · (c~x) = c(~v · ~x) = c(~u · ~x) = ~u · (c~x)
so that c~x ∈ V . Therefore V is closed under scalar multiplication. We conclude that V is a
linear subspace of R3 .
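The closure argument in (iii) can be sanity-checked on concrete numbers. Below is a small sketch; the particular vectors ~v , ~u, ~x, ~y are arbitrary choices, not taken from the exercise.

```python
from sympy import Matrix

# Arbitrary fixed vectors defining V = { x in R^3 : v·x = u·x }.
v = Matrix([1, 2, 3])
u = Matrix([3, 2, 1])

def in_V(x):
    """Membership test for V."""
    return v.dot(x) == u.dot(x)

zero = Matrix([0, 0, 0])
x = Matrix([1, 0, 1])  # v·x = 4 = u·x, so x is in V
y = Matrix([1, 1, 1])  # v·y = 6 = u·y, so y is in V

# The three subspace conditions checked in the solution:
print(in_V(zero), in_V(x + y), in_V(5 * x))  # True True True
```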
Exercise 4 (1 pt.)
Let ~v1 , . . . , ~vm be vectors in Rn , and let ~v ∈ Rn be a vector which is a linear combination of them,
i.e. there are scalars c1 , . . . , cm ∈ R such that
~v = c1~v1 + · · · + cm~vm .
Show that Span(~v1 , . . . , ~vm , ~v ) = Span(~v1 , . . . , ~vm ).
Solution
Let ~x ∈ Span(~v1 , . . . , ~vm ). We can write ~x as a linear combination of the ~v1 , . . . , ~vm , say
~x = x1~v1 + · · · + xm~vm .
Since 0~v is the zero vector, we can also write ~x as
~x = x1~v1 + · · · + xm~vm + 0~v .
This means that ~x is a linear combination of ~v1 , . . . , ~vm , ~v and so an element of Span(~v1 , . . . , ~vm , ~v ).
We see that any vector in Span(~v1 , . . . , ~vm ) is also in Span(~v1 , . . . , ~vm , ~v ), and so Span(~v1 , . . . , ~vm )
is a subset of Span(~v1 , . . . , ~vm , ~v ).
Conversely, let ~x ∈ Span(~v1 , . . . , ~vm , ~v ). We can write ~x as a linear combination of the ~v1 , . . . , ~vm , ~v ,
say
~x = x1~v1 + · · · + xm~vm + y~v .
Since ~v = c1~v1 + · · · + cm~vm , we have
~x = x1~v1 + · · · + xm~vm + y~v = x1~v1 + · · · + xm~vm + y (c1~v1 + · · · + cm~vm )
= x1~v1 + · · · + xm~vm + yc1~v1 + · · · + ycm~vm = (x1 + yc1 ) ~v1 + · · · + (xm + ycm ) ~vm .
This means that ~x is a linear combination of ~v1 , . . . , ~vm and so an element of Span(~v1 , . . . , ~vm ). We
see that any vector in Span(~v1 , . . . , ~vm , ~v ) is also in Span(~v1 , . . . , ~vm ), and so Span(~v1 , . . . , ~vm , ~v ) is
a subset of Span(~v1 , . . . , ~vm ). We deduce that the two sets are equal, i.e.
Span(~v1 , . . . , ~vm , ~v ) = Span(~v1 , . . . , ~vm ).
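The equality of spans can be illustrated numerically: adjoining a vector that is already a linear combination leaves the dimension of the span, i.e. the rank, unchanged. A sketch with arbitrary sample vectors:

```python
from sympy import Matrix

# Arbitrary sample vectors and a linear combination of them.
v1 = Matrix([1, 0, 2])
v2 = Matrix([0, 1, 1])
v = 2*v1 - 3*v2  # v is in Span(v1, v2) by construction

# dim Span(...) equals the rank of the matrix with those vectors
# as columns; equal ranks reflect Span(v1, v2, v) = Span(v1, v2).
M_small = Matrix.hstack(v1, v2)
M_big = Matrix.hstack(v1, v2, v)
print(M_small.rank(), M_big.rank())  # 2 2
```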
Exercise 5 (1 pt.)
Let ~v1 , . . . , ~vm be vectors in Rn . Show that if one of them is zero, i.e. ~vk = 0 for some k, then
the ~v1 , . . . , ~vm are linearly dependent.
Solution
We want to find a non-trivial linear relation that the ~v1 , . . . , ~vm satisfy, that is we want to find
scalars c1 , . . . , cm ∈ R not all zero, such that
c1~v1 + · · · + cm~vm = 0.
Let all ci be zero, except for ck , which is set to be ck = 1. Then ci~vi = 0 if i ≠ k, since ci is zero in
that case, and ck~vk = 0 since ~vk = 0. Hence
c1~v1 + · · · + cm~vm = 0
as all terms on the left side of the equation are zero.
Exercise 6 (1 pt.)
Let ~v1 , . . . , ~vm ∈ Rn be vectors that are linearly dependent, and ~v ∈ Rn another vector. Show that
the ~v1 , . . . , ~vm , ~v are also linearly dependent.
Solution
Since the ~v1 , . . . , ~vm are linearly dependent, there are some scalars c1 , . . . , cm , not all of which are
zero, such that
c1~v1 + · · · + cm~vm = 0.
Clearly 0~v = 0, and so
c1~v1 + · · · + cm~vm + 0~v = 0.
This is a linear relation among ~v1 , . . . , ~vm . Not all c1 , . . . , cm , 0 are zero, hence the linear relation
is non-trivial and the ~v1 , . . . , ~vm , ~v are linearly dependent.
Exercise 7 (1 pt.)
Let ~v1 , . . . , ~vm ∈ Rn be vectors. Show that the ~v1 , . . . , ~vm are linearly dependent if and only if
one of them is a linear combination of the rest.
Solution
First we show that if the ~v1 , . . . , ~vm are linearly dependent, then one of them is a linear combination
of the rest. Assume that the ~v1 , . . . , ~vm are linearly dependent. There are some scalars c1 , . . . , cm ,
not all of which are zero, such that

    c1~v1 + · · · + cm~vm = 0.

Pick one of the c1 , . . . , cm that is non-zero, say ck . Then we have

    c1~v1 + · · · + cm~vm = 0
    ⇒ −ck~vk = c1~v1 + · · · + ck−1~vk−1 + ck+1~vk+1 + · · · + cm~vm
    ⇒ ~vk = −(1/ck )(c1~v1 + · · · + ck−1~vk−1 + ck+1~vk+1 + · · · + cm~vm )
    ⇒ ~vk = (−c1 /ck )~v1 + · · · + (−ck−1 /ck )~vk−1 + (−ck+1 /ck )~vk+1 + · · · + (−cm /ck )~vm ,

and hence ~vk is a linear combination of the rest.
Now we show that if one of the ~v1 , . . . , ~vm is a linear combination of the rest, then the ~v1 , . . . , ~vm
are linearly dependent. Say ~vk is a linear combination of the rest, and write

    ~vk = c1~v1 + · · · + ck−1~vk−1 + ck+1~vk+1 + · · · + cm~vm .
Bringing ~vk to the other side, we get
c1~v1 + · · · + ck−1~vk−1 − ~vk + ck+1~vk+1 + · · · + cm~vm = 0.
Not all of c1 , . . . , ck−1 , −1, ck+1 , . . . , cm are zero (as −1 ≠ 0), so that the above is a non-trivial
relation among ~v1 , . . . , ~vm . We conclude that the ~v1 , . . . , ~vm are linearly dependent.
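The forward direction of the proof is constructive, and can be traced on a tiny example; the vectors and coefficients below are arbitrary sample choices.

```python
from sympy import Matrix, Rational

# Dependent sample vectors: v3 = v1 + 2*v2, so the relation
# 1*v1 + 2*v2 + (-1)*v3 = 0 is non-trivial.
v1 = Matrix([1, 0])
v2 = Matrix([0, 1])
v3 = v1 + 2*v2
vs = [v1, v2, v3]
c = [1, 2, -1]  # coefficients of the non-trivial relation

# Solve for the vector whose coefficient is non-zero, as in the
# proof: vk = sum over i != k of (-c_i / c_k) * v_i.
k = 2  # c[2] = -1 is non-zero
vk = Matrix.zeros(2, 1)
for i in range(len(vs)):
    if i != k:
        vk += Rational(-c[i], c[k]) * vs[i]
print(vk == vs[k])  # True
```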
Exercise 8 (1 pt.)
Let V be a subspace of Rn and let {~v1 , . . . , ~vm } be a basis for it. Show that any vector in V can
be written as a linear combination of the ~v1 , . . . , ~vm in a unique way.
Solution
Let ~v be an arbitrary vector in V . Since the ~v1 , . . . , ~vm form a basis for V , V = Span(~v1 , . . . , ~vm ).
In particular ~v is in Span(~v1 , . . . , ~vm ) and so it is a linear combination of the ~v1 , . . . , ~vm , say
~v = c1~v1 + · · · + cm~vm
for some scalars c1 , . . . , cm . We claim that this is the only way that ~v can be written as a linear
combination of the ~v1 , . . . , ~vm . Assume that there is another such linear combination, say
    ~v = c′1~v1 + · · · + c′m~vm

for some scalars c′1 , . . . , c′m . Subtracting the two equations for ~v , we get

    (c1~v1 + · · · + cm~vm ) − (c′1~v1 + · · · + c′m~vm ) = ~v − ~v  ⇒  (c1 − c′1 )~v1 + · · · + (cm − c′m )~vm = 0.

The latter is a linear relation among the ~v1 , . . . , ~vm . Since they form a basis for V , they are linearly
independent and so the linear relation has to be trivial, i.e. all the coefficients of the ~vi must be
zero. This means

    c1 − c′1 = 0, . . . , cm − c′m = 0,

or equivalently

    c1 = c′1 , . . . , cm = c′m .
Therefore the two expressions of ~v as a linear combination of ~v1 , . . . , ~vm are the same. We deduce
that any vector in V can be written as a linear combination of the ~v1 , . . . , ~vm in a unique way.
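For a concrete basis the uniqueness can be observed by solving for the coefficients directly; the basis below is an arbitrary sample, and linsolve is SymPy's exact linear solver.

```python
from sympy import Matrix, symbols, linsolve

# An arbitrary sample basis of a plane V in R^3.
v1 = Matrix([1, 0, 1])
v2 = Matrix([0, 1, 1])
v = 2*v1 + 5*v2  # a vector of V, built with coordinates (2, 5)

# Solve B*[c1, c2]^t = v; a solution set with a single element
# means the expression of v in the basis is unique.
c1, c2 = symbols('c1 c2')
B = Matrix.hstack(v1, v2)
sols = linsolve((B, v), [c1, c2])
print(sols)
```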