The Australian National University Semester 2, 2021
School of Computing Tutorial 3
Liang Zheng

COMP3670/6670: Introduction to Machine Learning

Exercises marked with ! are harder; !! marks very difficult exercises, and !!! marks optional challenge
exercises.

This tutorial is primarily about proofs in analytic geometry. There are far too many exercises to
do in the two hours, so you should choose some particular ones to work on. Your tutor will present some
in class; feel free to post partial solutions on Piazza if you get stuck.

Question 1 Properties of the zero vector

Show that for any vector space V with any inner product 〈·, ·〉 : V × V → R, we have that 0 (the zero
vector) is orthogonal to every vector v ∈ V .
Also, show that for any vector v ∈ V, the set {v,0} is linearly dependent.

Question 2 Inner products

Prove that the standard Euclidean inner product on R2 given by

x · y := x1y1 + x2y2

is an inner product.

Question 3 Pythagoras

We have that any inner product induces a norm,

‖x‖ := √〈x,x〉

Show that for two orthogonal vectors x and y (that is, 〈x,y〉 = 0), the following holds:

‖x‖2 + ‖y‖2 = ‖x + y‖2

(This is an extension of Pythagoras’ Theorem: for a right-angled triangle with hypotenuse of length
c and two other sides of length a and b, we have a2 + b2 = c2.)
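As a quick numerical sanity check (not a proof), the identity can be verified with the Euclidean inner product on R2; the orthogonal pair below is an arbitrary choice:

```python
# Check the Pythagorean identity for an arbitrary orthogonal pair in R^2.
x = (3.0, 0.0)
y = (0.0, 4.0)

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def norm_sq(a):
    return dot(a, a)

s = tuple(ai + bi for ai, bi in zip(x, y))
assert dot(x, y) == 0            # x and y are orthogonal
print(norm_sq(x) + norm_sq(y))   # 25.0
print(norm_sq(s))                # 25.0
```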

Question 4 ! Parseval’s Identity

Let V be a vector space, together with an inner product 〈·, ·〉 : V × V → R. Given a set of orthogonal
vectors {x1, . . . ,xn}, show that

‖x1 + · · ·+ xn‖2 = ‖x1‖2 + · · ·+ ‖xn‖2
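A numerical sanity check of the identity, assuming the Euclidean inner product on R3 and a hand-picked orthogonal set:

```python
# Check the Parseval-style identity for an orthogonal set in R^3.
xs = [(1.0, 1.0, 0.0), (1.0, -1.0, 0.0), (0.0, 0.0, 2.0)]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

# Verify pairwise orthogonality of the chosen set.
for i in range(len(xs)):
    for j in range(i + 1, len(xs)):
        assert dot(xs[i], xs[j]) == 0

total = tuple(sum(v[k] for v in xs) for k in range(3))
lhs = dot(total, total)            # ||x1 + ... + xn||^2
rhs = sum(dot(v, v) for v in xs)   # ||x1||^2 + ... + ||xn||^2
print(lhs, rhs)   # 8.0 8.0
```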

Question 5 Norms

1. Prove that the Manhattan norm (l1 norm) on R2 defined by

‖x‖1 := |x1|+ |x2|

is a norm. (You will need the triangle inequality on R, |a + b| ≤ |a| + |b|, to help you.)

2. ! Prove that the supremum norm (l∞ norm) on R2 defined by

‖x‖∞ := max (|x1| , |x2|)

is a norm. (Hint: You will need the triangle inequality on R, and the property that if A ⊆ B, then
maxx∈A f(x) ≤ maxx∈B f(x).)
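Both norms can be spot-checked numerically on a few sample pairs (a sanity check of the triangle inequality, not a substitute for the proof):

```python
# Spot-check the l1 and l_inf triangle inequalities on sample pairs in R^2.
def l1(x):
    return abs(x[0]) + abs(x[1])

def linf(x):
    return max(abs(x[0]), abs(x[1]))

pairs = [((1, -2), (3, 4)), ((-1.5, 0.5), (2.0, -2.5)), ((0, 0), (5, -7))]
for x, y in pairs:
    s = (x[0] + y[0], x[1] + y[1])
    assert l1(s) <= l1(x) + l1(y)
    assert linf(s) <= linf(x) + linf(y)
print("triangle inequality holds on all sampled pairs")
```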


Question 6 ! Basis of a vector space

Let V be a finite dimensional vector space, and let B = {b1, . . . ,bn} be a basis for V. Suppose that
for any two basis vectors bi and bj , we can compute the inner product 〈bi,bj〉 . Then, show that for
any two vectors u,v ∈ V, we can express the inner product 〈u,v〉 in terms of the inner products of
basis vectors 〈bi,bj〉.
(Hint: Use the fact that B spans the space V .)
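The idea can be made concrete in code: collecting the values 〈bi,bj〉 into a Gram matrix G, the inner product of two vectors reduces to a double sum over their coordinates. The matrix G below is a hypothetical example (any symmetric positive-definite matrix would do), as is the choice of coordinates:

```python
# Sketch: given the Gram matrix G[i][j] = <b_i, b_j> of a basis,
# <u, v> for u = sum_i a_i b_i and v = sum_j c_j b_j follows by bilinearity.
# G is a hypothetical example, not from the question.
G = [[2.0, 1.0],
     [1.0, 3.0]]

def inner(a, c):
    # <u, v> = sum_i sum_j a_i c_j <b_i, b_j>
    return sum(a[i] * G[i][j] * c[j]
               for i in range(len(a)) for j in range(len(c)))

a = [1.0, 2.0]   # coordinates of u in the basis B (arbitrary choice)
c = [3.0, -1.0]  # coordinates of v in the basis B (arbitrary choice)
print(inner(a, c))   # 5.0
```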

Question 7 Orthogonal matrices preserve angles and norms

Suppose we are in the vector space Rn, together with the standard Euclidean dot product, that is

〈x,y〉 = x · y := xTy

Let
‖x‖2 := √(x · x).

Let A ∈ Rn×n be an orthogonal matrix (that is, A−1 = AT ).

Show that for any vector x ∈ Rn that
‖Ax‖2 = ‖x‖2

Using the above result (or otherwise), show that if the angle between two vectors x and y is θ then
the angle between Ax and Ay is either θ, or −θ (modulo 2π).
Give an example of an orthogonal matrix A ∈ Rn×n and vectors x,y ∈ Rn such that the angle
between x and y is not the same as the angle between Ax and Ay.
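A numerical illustration of both claims; the reflection below is one possible choice for the last part (a reflection is orthogonal with determinant −1, and it flips the signed angle from x to y):

```python
# An orthogonal matrix preserves Euclidean norms; a reflection flips
# the signed angle between vectors. Sample vectors are arbitrary choices.
import math

A = [[1.0, 0.0],
     [0.0, -1.0]]   # reflection across the x-axis; A^{-1} = A^T

def matvec(M, x):
    return [sum(M[i][j] * x[j] for j in range(2)) for i in range(2)]

def norm(x):
    return math.sqrt(x[0] ** 2 + x[1] ** 2)

def signed_angle(x, y):
    # signed angle from x to y, in (-pi, pi]
    return math.atan2(x[0] * y[1] - x[1] * y[0], x[0] * y[0] + x[1] * y[1])

x, y = [1.0, 0.0], [1.0, 1.0]
assert math.isclose(norm(matvec(A, x)), norm(x))   # norms preserved
assert math.isclose(norm(matvec(A, y)), norm(y))
print(signed_angle(x, y))                          # pi/4
print(signed_angle(matvec(A, x), matvec(A, y)))    # -pi/4
```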

Question 8 Rotation matrices preserve norms

Given a vector x ∈ R2 and the rotation matrix

R(θ) =
[ cos θ  − sin θ ]
[ sin θ    cos θ ]
Show that for any angle of rotation θ, we have

‖x‖2 = ‖R(θ)x‖2
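A quick numerical check of the claim for a few sample angles (the test vector is an arbitrary choice):

```python
# Verify ||R(theta) x|| == ||x|| numerically for sample angles.
import math

def rotate(theta, x):
    c, s = math.cos(theta), math.sin(theta)
    return [c * x[0] - s * x[1], s * x[0] + c * x[1]]

def norm(x):
    return math.sqrt(x[0] ** 2 + x[1] ** 2)

x = [3.0, -4.0]   # ||x|| = 5
for theta in [0.0, 0.3, math.pi / 2, 2.0, math.pi]:
    assert math.isclose(norm(rotate(theta, x)), norm(x))
print("norm preserved for all sampled angles")
```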

Question 9 Gram-Schmidt

Let e1 = (1, 0)T and e2 = (0, 1)T, the standard basis vectors for R2. Let v be any vector in R2.

Define the projection operator

proju(v) := (〈v,u〉/〈u,u〉)u

(If u = 0, then we define proj0(v) = 0.)

The Gram-Schmidt algorithm takes a set of vectors v1, . . . ,vn and proceeds as follows:

u1 = v1
u2 = v2 − proju1(v2)
u3 = v3 − proju1(v3) − proju2(v3)
. . .
un = vn − proju1(vn) − · · · − projun−1(vn)

The output {u1, . . . ,un} is a set of mutually orthogonal vectors that spans the same subspace as
{v1, . . . ,vn}. (If the dimension of the space spanned by the vi’s is less than n, then some of the ui’s
will be zero.)


Suppose we are working in the vector space R2.
Show that if we input {v1,v2,v3} = {e1, e2,v} to the Gram-Schmidt algorithm, the output is
{u1,u2,u3} = {e1, e2,0}
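The procedure above can be sketched directly in code (pure Python, Euclidean inner product, with the convention proj0(v) = 0); running it on {e1, e2, v} for an arbitrary choice of v reproduces the claimed output:

```python
# Gram-Schmidt as described above (no normalisation step).
def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def proj(u, v):
    # Projection of v onto u; proj_0(v) := 0 by convention.
    uu = dot(u, u)
    if uu == 0:
        return [0.0] * len(u)
    c = dot(v, u) / uu
    return [c * ui for ui in u]

def gram_schmidt(vs):
    us = []
    for v in vs:
        u = list(v)
        for uj in us:
            # u_n = v_n - sum_j proj_{u_j}(v_n): subtract each projection of v.
            p = proj(uj, v)
            u = [ui - pi for ui, pi in zip(u, p)]
        us.append(u)
    return us

e1, e2 = [1.0, 0.0], [0.0, 1.0]
v = [2.0, -3.0]                      # arbitrary choice of v
print(gram_schmidt([e1, e2, v]))     # [[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]]
```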

Question 10 !!! Cauchy-Schwarz

Prove the Cauchy-Schwarz inequality for a general inner product and its induced norm:

|〈u,v〉| ≤ ‖u‖‖v‖

(Hint: Let z = u − (〈u,v〉/〈v,v〉)v, and start with the fact that 〈z, z〉 ≥ 0.)
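A numerical sanity check with the Euclidean inner product (this only illustrates the inequality on sample vectors; the exercise asks for a general proof):

```python
# Spot-check Cauchy-Schwarz |<u, v>| <= ||u|| ||v|| on sample pairs.
import math

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

pairs = [([1.0, 2.0], [3.0, -1.0]),
         ([0.0, 0.0], [5.0, 5.0]),
         ([2.0, 2.0], [4.0, 4.0])]   # last pair is parallel: equality holds
for u, v in pairs:
    lhs = abs(dot(u, v))
    rhs = math.sqrt(dot(u, u)) * math.sqrt(dot(v, v))
    assert lhs <= rhs + 1e-12
print("Cauchy-Schwarz holds on all sampled pairs")
```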
