
Math 215.01, Spring Term 1, 2021 Writing Assignment #5
Turn your file in on Pweb in pdf format under "Writing Assignments". There is no need to put your name on the document since I will grade anonymously, and Pweb will keep track of the authorship of the document.
You will be graded on your writing (use of quantifiers, statement and use of definitions, and other mathematical language) as well as the validity and completeness of your mathematical arguments.
1. In our definition of the elementary row operation known as "row combination", we replace row $i$ by the sum of itself and a multiple of a different row $j$ (where different means that $j \neq i$). Suppose then that we consider the operation where we take a row $i$ and replace it by the sum of itself and a multiple of row $i$. Do we necessarily preserve the solution set of the system by doing this? As always, you must explain if your answer is yes, or you must provide a specific counterexample (with justification) if your answer is no.
Consider the system of equations:
\begin{align*}
x + y &= 3 \\
2x + 3y &= 8.
\end{align*}
The augmented matrix of this system is
\[
\left[\begin{array}{cc|c} 1 & 1 & 3 \\ 2 & 3 & 8 \end{array}\right].
\]
Replacing $R_2$ with $R_2 + (-2) \cdot R_1$ gives
\[
\left[\begin{array}{cc|c} 1 & 1 & 3 \\ 2 & 3 & 8 \end{array}\right]
\longrightarrow
\left[\begin{array}{cc|c} 1 & 1 & 3 \\ 0 & 1 & 2 \end{array}\right].
\]
Then we have a system of equations with the same solution set as the original system:
\begin{align*}
x + y &= 3 \\
y &= 2.
\end{align*}
Proposition 4.2.12 tells us that if a matrix is in echelon form and every column, except the last column, has a leading entry, then the system is consistent and has a unique solution. Since the matrix is in echelon form and every column, except the last column, has a leading entry, we know the system is consistent and has a unique solution. After solving the system, we have $x = 1$ and $y = 2$ as the unique solution, so the system has the unique solution set $\{(1, 2)\}$.
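As a quick numerical cross-check (not part of the required argument), one can verify with sympy that the original and row-reduced augmented matrices have the same solution set; the use of sympy here is my own illustration, not something the assignment asks for:

```python
from sympy import Matrix, symbols, linsolve

x, y = symbols("x y")

# Original system: x + y = 3, 2x + 3y = 8 (augmented matrix form).
original = Matrix([[1, 1, 3], [2, 3, 8]])
# After the legal row combination R2 -> R2 + (-2)*R1.
reduced = Matrix([[1, 1, 3], [0, 1, 2]])

print(linsolve(original, x, y))  # {(1, 2)}
print(linsolve(reduced, x, y))   # {(1, 2)}, the same solution set
```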
Now notice that if we instead replace $R_2$ by $R_2 + (-1) \cdot R_2$, we have
\[
\left[\begin{array}{cc|c} 1 & 1 & 3 \\ 2 & 3 & 8 \end{array}\right]
\longrightarrow
\left[\begin{array}{cc|c} 1 & 1 & 3 \\ 0 & 0 & 0 \end{array}\right].
\]
Notice that we have a new system consisting of the single equation:
\[
x + y = 3.
\]
Then we have $x = 3 - y$. Proposition 4.2.12 also tells us that if the last column of the echelon-form matrix contains no leading entry and there is at least one other column without a leading entry, then the system is consistent and has infinitely many solutions. Since the matrix is in echelon form and neither the last column nor the second column has a leading entry, we know the system is consistent and has infinitely many solutions. Let $t \in \mathbb{R}$ be arbitrary, so that
\begin{align*}
x &= 3 - t \\
y &= t.
\end{align*}

The solution set for the new system is
\[
\left\{ \begin{bmatrix} 3 \\ 0 \end{bmatrix} + t \begin{bmatrix} -1 \\ 1 \end{bmatrix} : t \in \mathbb{R} \right\}.
\]
Since the solution set is not preserved by replacing $R_2$ by the sum of $R_2$ and $(-1) \cdot R_2$ in this example, we can say that the operation where we take a row $i$ and replace it by the sum of itself and a multiple of row $i$ does not necessarily preserve the solution set of the system.
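Again as an optional illustration (the same hypothetical sympy check as above), the degenerate operation $R_2 \to R_2 + (-1) \cdot R_2$ produces a different, infinite solution set:

```python
from sympy import Matrix, symbols, linsolve

x, y = symbols("x y")

# After R2 -> R2 + (-1)*R2, the second row becomes all zeros.
degenerate = Matrix([[1, 1, 3], [0, 0, 0]])

# y is now a free variable, so the solution set is infinite:
print(linsolve(degenerate, x, y))  # {(3 - y, y)}
```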

2. Let $V$ be a vector space. Suppose that $U$ and $W$ are both subspaces of $V$. We have already proven that the union $U \cup W$ is not necessarily a subspace of $V$. Now, define $U + W$ as
\[
\{\vec{v} \in V : \text{there exists } \vec{u} \in U \text{ and } \vec{w} \in W \text{ with } \vec{v} = \vec{u} + \vec{w}\}.
\]
That is, $U + W$ is the set of all vectors in $V$ that can be written as the sum of an element of $U$ and an element of $W$. Show that $U + W$ is a subspace of $V$.
Definition 4.1.12: Let $K$ be a vector space. A subspace of $K$ is a subset $Z \subseteq K$ with the following properties:
1. $\vec{0}_K \in Z$.
2. For all $\vec{z}_1, \vec{z}_2 \in Z$, we have $\vec{z}_1 + \vec{z}_2 \in Z$.
3. For all $\vec{z} \in Z$ and all $m \in \mathbb{R}$, we have $m \cdot \vec{z} \in Z$.
Let $V$ be a vector space and let $U$ and $W$ be subspaces of $V$. Let $U + W$ be the set of all vectors in $V$ that can be written as the sum of an element of $U$ and an element of $W$. Since $U$ and $W$ are subspaces, and definition 4.1.12, property 1, tells us the zero vector of a vector space lies in each of its subspaces, we know $\vec{0}_V \in U$ and $\vec{0}_V \in W$. Since $V$ is a vector space, we know $\vec{0}_V \in V$. Now notice $\vec{0}_V + \vec{0}_V = \vec{0}_V$. Since $\vec{0}_V \in U$ and $\vec{0}_V \in W$, according to the definition of $U + W$ we have shown $\vec{0}_V \in U + W$.
Let $\vec{x}, \vec{y} \in U + W$ be arbitrary and fix $\vec{u}_1, \vec{u}_2 \in U$ and $\vec{w}_1, \vec{w}_2 \in W$ such that $\vec{x} = \vec{u}_1 + \vec{w}_1$ and $\vec{y} = \vec{u}_2 + \vec{w}_2$. By commutativity and associativity of addition, we have $\vec{x} + \vec{y} = (\vec{u}_1 + \vec{w}_1) + (\vec{u}_2 + \vec{w}_2) = (\vec{u}_1 + \vec{u}_2) + (\vec{w}_1 + \vec{w}_2)$. Since $U$ and $W$ are subspaces of $V$, and definition 4.1.12, property 2, tells us subspaces are closed under vector addition, we know $\vec{u}_1 + \vec{u}_2 \in U$ and $\vec{w}_1 + \vec{w}_2 \in W$. Since $\vec{u}_1 + \vec{u}_2 \in U$ and $\vec{w}_1 + \vec{w}_2 \in W$, according to the definition of $U + W$ we know $(\vec{u}_1 + \vec{u}_2) + (\vec{w}_1 + \vec{w}_2) \in U + W$, which means $\vec{x} + \vec{y} \in U + W$. Since $\vec{x}, \vec{y} \in U + W$ are arbitrary, we have shown $U + W$ is closed under vector addition.
Let $\vec{h} \in U + W$ be arbitrary and fix $\vec{u}_3 \in U$ and $\vec{w}_3 \in W$ such that $\vec{h} = \vec{u}_3 + \vec{w}_3$. Let $c \in \mathbb{R}$ be arbitrary. By distributivity of scalar multiplication, we have $c \cdot \vec{h} = c \cdot (\vec{u}_3 + \vec{w}_3) = c \cdot \vec{u}_3 + c \cdot \vec{w}_3$. Since $U$ and $W$ are subspaces of $V$, and definition 4.1.12, property 3, tells us subspaces are closed under scalar multiplication, we know $c \cdot \vec{u}_3 \in U$ and $c \cdot \vec{w}_3 \in W$. Since $c \cdot \vec{u}_3 \in U$ and $c \cdot \vec{w}_3 \in W$, we have $c \cdot \vec{u}_3 + c \cdot \vec{w}_3 \in U + W$, which means $c \cdot \vec{h} \in U + W$. Since $c \in \mathbb{R}$ is arbitrary and $\vec{h} \in U + W$ is arbitrary, we know $U + W$ is closed under scalar multiplication. Since all three properties listed in definition 4.1.12 are satisfied by $U + W$, we can conclude $U + W$ is a subspace of $V$.
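The abstract proof above is what the problem requires; purely as an illustration, here is a numerical sketch in one concrete instance. The choice $V = \mathbb{R}^3$, $U = \operatorname{span}\{(1,0,0)\}$, $W = \operatorname{span}\{(0,1,0)\}$ is my own hypothetical example, in which $U + W$ is the $xy$-plane:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical instance: U = span{(1,0,0)}, W = span{(0,1,0)} inside R^3,
# so U + W is exactly the set of vectors whose third coordinate is 0.
u_basis = np.array([1.0, 0.0, 0.0])
w_basis = np.array([0.0, 1.0, 0.0])

def in_U_plus_W(v):
    # Membership test valid only for this particular example.
    return np.isclose(v[2], 0.0)

# Spot-check the three subspace properties on random samples.
assert in_U_plus_W(np.zeros(3))  # property 1: zero vector
for _ in range(100):
    x = rng.normal() * u_basis + rng.normal() * w_basis
    y = rng.normal() * u_basis + rng.normal() * w_basis
    c = rng.normal()
    assert in_U_plus_W(x + y)    # property 2: closure under addition
    assert in_U_plus_W(c * x)    # property 3: closure under scaling

print("all subspace properties hold on the samples")
```

Of course, finitely many samples cannot replace the universally quantified proof; the sketch only illustrates the statement.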

3. Let $V$ be a vector space. Let
\[
\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_n, \vec{w}_1, \vec{w}_2, \ldots, \vec{w}_m \in V.
\]
Assume that $(\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_n)$ is linearly dependent. Show that $(\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_n, \vec{w}_1, \vec{w}_2, \ldots, \vec{w}_m)$ is linearly dependent. Make sure that the definition of linearly dependent (not just "not linearly independent") is used explicitly in your answer.
Definition of linearly dependent: Let $K$ be a vector space and suppose $\vec{x}_1, \vec{x}_2, \ldots, \vec{x}_n \in K$. The sequence $(\vec{x}_1, \vec{x}_2, \ldots, \vec{x}_n)$ is linearly dependent if there exists a sequence of coefficients $a_1, a_2, \ldots, a_n \in \mathbb{R}$, with at least one $a_i$ nonzero, such that $a_1\vec{x}_1 + a_2\vec{x}_2 + \cdots + a_n\vec{x}_n = \vec{0}_K$.
Let $V$ be a vector space. Let $\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_n, \vec{w}_1, \vec{w}_2, \ldots, \vec{w}_m \in V$. Let $(\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_n)$ be linearly dependent. Since $(\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_n)$ is linearly dependent, according to the definition of linearly dependent there exists a sequence of coefficients $c_1, c_2, \ldots, c_n \in \mathbb{R}$, with at least one $c_i$ nonzero, such that $c_1\vec{u}_1 + c_2\vec{u}_2 + \cdots + c_n\vec{u}_n = \vec{0}_V$. Since $0$ times any vector is equal to $\vec{0}_V$, we know

\begin{align*}
\vec{0}_V &= c_1\vec{u}_1 + c_2\vec{u}_2 + \cdots + c_n\vec{u}_n \\
&= c_1\vec{u}_1 + c_2\vec{u}_2 + \cdots + c_n\vec{u}_n + 0\vec{w}_1 + 0\vec{w}_2 + \cdots + 0\vec{w}_m.
\end{align*}
Let $c_{n+1} = c_{n+2} = \cdots = c_{n+m} = 0$. Then we have
\begin{align*}
\vec{0}_V &= c_1\vec{u}_1 + c_2\vec{u}_2 + \cdots + c_n\vec{u}_n + 0\vec{w}_1 + 0\vec{w}_2 + \cdots + 0\vec{w}_m \\
&= c_1\vec{u}_1 + c_2\vec{u}_2 + \cdots + c_n\vec{u}_n + c_{n+1}\vec{w}_1 + c_{n+2}\vec{w}_2 + \cdots + c_{n+m}\vec{w}_m.
\end{align*}
Since $\vec{0}_V = c_1\vec{u}_1 + c_2\vec{u}_2 + \cdots + c_n\vec{u}_n + c_{n+1}\vec{w}_1 + c_{n+2}\vec{w}_2 + \cdots + c_{n+m}\vec{w}_m$, and at least one $c_i$ is nonzero in the sequence of coefficients $c_1, c_2, \ldots, c_n \in \mathbb{R}$, we know at least one $c_i$ is nonzero in the sequence of coefficients $c_1, c_2, \ldots, c_n, c_{n+1}, c_{n+2}, \ldots, c_{n+m} \in \mathbb{R}$ such that $\vec{0}_V = c_1\vec{u}_1 + \cdots + c_n\vec{u}_n + c_{n+1}\vec{w}_1 + \cdots + c_{n+m}\vec{w}_m$. Therefore, according to the definition of linearly dependent, we have shown $(\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_n, \vec{w}_1, \vec{w}_2, \ldots, \vec{w}_m)$ is linearly dependent.
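As a concrete illustration (my own hypothetical vectors, not from the problem), linear dependence can be detected numerically via matrix rank, and appending extra vectors never removes the dependence:

```python
import numpy as np

def is_dependent(vectors):
    # A finite list of vectors is linearly dependent exactly when the rank
    # of the matrix having them as columns is less than the list's length.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) < len(vectors)

# Hypothetical example: u2 = 2*u1, so (u1, u2) is dependent.
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([2.0, 0.0, 0.0])
w1 = np.array([0.0, 1.0, 0.0])

print(is_dependent([u1, u2]))      # True
print(is_dependent([u1, u2, w1]))  # True: still dependent after appending w1
```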

4. Let $V$ be a vector space and let $\vec{u}, \vec{v}, \vec{w} \in V$. Assume that $(\vec{u}, \vec{v}, \vec{w})$ is linearly independent. Show that
\[
(\vec{u} + \vec{v},\ \vec{u} + \vec{w},\ \vec{v} + \vec{w})
\]
is linearly independent. Make sure that the definition of linearly independent is used explicitly in your answer.
Hint: Don't forget what we have learned about systems of equations; systems of equations might be helpful for a complete answer here.
Definition of linearly independent: Let $K$ be a vector space and suppose $\vec{k}_1, \vec{k}_2, \ldots, \vec{k}_n \in K$. We say that $(\vec{k}_1, \ldots, \vec{k}_n)$ is a linearly independent sequence if the following statement holds: for all $a_1, \ldots, a_n \in \mathbb{R}$, if $a_1\vec{k}_1 + \cdots + a_n\vec{k}_n = \vec{0}_K$, then $a_1 = \cdots = a_n = 0$.
Let $V$ be a vector space and let $\vec{u}, \vec{v}, \vec{w} \in V$ be arbitrary such that $(\vec{u}, \vec{v}, \vec{w})$ is linearly independent. Fix $a, b, c \in \mathbb{R}$ such that $a(\vec{u} + \vec{v}) + b(\vec{u} + \vec{w}) + c(\vec{v} + \vec{w}) = \vec{0}_V$. By distributivity of scalar multiplication, we have $a(\vec{u} + \vec{v}) + b(\vec{u} + \vec{w}) + c(\vec{v} + \vec{w}) = a\vec{u} + a\vec{v} + b\vec{u} + b\vec{w} + c\vec{v} + c\vec{w} = \vec{0}_V$. By commutativity of addition and distributivity of scalar multiplication, we have $a\vec{u} + a\vec{v} + b\vec{u} + b\vec{w} + c\vec{v} + c\vec{w} = (a+b)\vec{u} + (a+c)\vec{v} + (b+c)\vec{w}$, so $(a+b)\vec{u} + (a+c)\vec{v} + (b+c)\vec{w} = \vec{0}_V$. Since $(\vec{u}, \vec{v}, \vec{w})$ is linearly independent, according to the definition of linearly independent we know $a+b = a+c = b+c = 0$. Then we have the following system:
\begin{align*}
a + b &= 0 \\
a + c &= 0 \\
b + c &= 0.
\end{align*}
The augmented matrix of this system is
\[
\left[\begin{array}{ccc|c} 1 & 1 & 0 & 0 \\ 1 & 0 & 1 & 0 \\ 0 & 1 & 1 & 0 \end{array}\right].
\]
By replacing $R_2$ with $R_2 + (-1) \cdot R_1$, we have
\[
\left[\begin{array}{ccc|c} 1 & 1 & 0 & 0 \\ 1 & 0 & 1 & 0 \\ 0 & 1 & 1 & 0 \end{array}\right]
\longrightarrow
\left[\begin{array}{ccc|c} 1 & 1 & 0 & 0 \\ 0 & -1 & 1 & 0 \\ 0 & 1 & 1 & 0 \end{array}\right].
\]
By replacing $R_3$ with $R_3 + R_2$, we have
\[
\left[\begin{array}{ccc|c} 1 & 1 & 0 & 0 \\ 0 & -1 & 1 & 0 \\ 0 & 1 & 1 & 0 \end{array}\right]
\longrightarrow
\left[\begin{array}{ccc|c} 1 & 1 & 0 & 0 \\ 0 & -1 & 1 & 0 \\ 0 & 0 & 2 & 0 \end{array}\right].
\]
Now we have a system with the same solution set as the original system:
\begin{align*}
a + b &= 0 \\
-b + c &= 0 \\
2c &= 0.
\end{align*}
Proposition 4.2.12 tells us that if a matrix is in echelon form and every column, except the last column, has a leading entry, then the system is consistent and has a unique solution. Since the matrix is in echelon form and every column, except the last column, has a leading entry, we know the system is consistent and has a unique solution. After solving the system, we have $a = b = c = 0$, and by proposition 4.2.12 this is the unique solution to the system. Since $a = b = c = 0$ is the unique solution for $a(\vec{u} + \vec{v}) + b(\vec{u} + \vec{w}) + c(\vec{v} + \vec{w}) = \vec{0}_V$, and $\vec{u}, \vec{v}, \vec{w} \in V$ are arbitrary such that $(\vec{u}, \vec{v}, \vec{w})$ is linearly independent, we have shown $(\vec{u} + \vec{v},\ \vec{u} + \vec{w},\ \vec{v} + \vec{w})$ is linearly independent.
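As an optional cross-check of the row reduction (again a sympy sketch of my own, not required by the assignment), the coefficient matrix of the system has nonzero determinant and trivial null space, so $a = b = c = 0$ is indeed the only solution:

```python
from sympy import Matrix

# Coefficient matrix of the system a + b = 0, a + c = 0, b + c = 0.
A = Matrix([[1, 1, 0], [1, 0, 1], [0, 1, 1]])

print(A.det())        # -2: nonzero, so A is invertible
print(A.nullspace())  # []: only the trivial solution a = b = c = 0
```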

5. Comment on working with partner(s): Comment on the work you and your partner(s) accomplished together and what you accomplished apart.