Analytic Geometry 1
Australian National University
3.1 Norms
• A norm on a vector space 𝑉 is a function
$\|\cdot\| \colon V \to \mathbb{R}, \quad \boldsymbol{x} \mapsto \|\boldsymbol{x}\|,$
which assigns each vector 𝒙 its length $\|\boldsymbol{x}\| \in \mathbb{R}$.
Examples
• The Manhattan norm on $\mathbb{R}^n$ is defined for $\boldsymbol{x} \in \mathbb{R}^n$ as
$\|\boldsymbol{x}\|_1 := \sum_{i=1}^{n} |x_i|,$
where $|\cdot|$ is the absolute value. It is also called the $\ell_1$ norm.
• The Euclidean norm of $\boldsymbol{x} \in \mathbb{R}^n$ is defined as
$\|\boldsymbol{x}\|_2 := \sqrt{\sum_{i=1}^{n} x_i^2} = \sqrt{\boldsymbol{x}^\top \boldsymbol{x}}.$
It is the Euclidean distance of 𝒙 from the origin; also called the $\ell_2$ norm.
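• As a quick numerical sketch (the vector here is an illustrative choice, not from the slides), both norms can be computed directly in NumPy:

```python
import numpy as np

x = np.array([3.0, -4.0])

# Manhattan (l1) norm: sum of the absolute values of the components
manhattan = np.sum(np.abs(x))      # 7.0

# Euclidean (l2) norm: sqrt(x^T x)
euclidean = np.sqrt(x @ x)         # 5.0

# Both agree with NumPy's built-in norm
assert manhattan == np.linalg.norm(x, ord=1)
assert euclidean == np.linalg.norm(x, ord=2)
```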
3.1 Norms
For all 𝜆 ∈ R, and 𝒙, 𝒚 ∈ 𝑉 the following holds:
• Absolutely homogeneous: $\|\lambda\boldsymbol{x}\| = |\lambda|\,\|\boldsymbol{x}\|$
• Triangle inequality: $\|\boldsymbol{x} + \boldsymbol{y}\| \le \|\boldsymbol{x}\| + \|\boldsymbol{y}\|$
• Positive definite: $\|\boldsymbol{x}\| \ge 0$ and $\|\boldsymbol{x}\| = 0 \iff \boldsymbol{x} = \boldsymbol{0}$
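• These three properties can be checked numerically for the Euclidean norm (a sketch with arbitrary randomly drawn vectors, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
x, y = rng.standard_normal(3), rng.standard_normal(3)
lam = -2.5
norm = np.linalg.norm

assert np.isclose(norm(lam * x), abs(lam) * norm(x))  # absolute homogeneity
assert norm(x + y) <= norm(x) + norm(y)               # triangle inequality
assert norm(x) >= 0 and norm(np.zeros(3)) == 0        # positive definiteness
```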
3.2.1 Dot Product
• The scalar product/dot product in $\mathbb{R}^n$ is given by
$\boldsymbol{x}^\top \boldsymbol{y} = \sum_{i=1}^{n} x_i y_i,$
i.e., the matrix product of a $1 \times n$ row vector and an $n \times 1$ column vector.
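• In code, the sum over componentwise products and the $1 \times n$ by $n \times 1$ matrix product coincide (example vectors are illustrative):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

# sum_i x_i * y_i, equivalently the row-by-column matrix product x^T y
dot = np.sum(x * y)    # 32.0
assert dot == x @ y
```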
Bilinear mapping
• A bilinear mapping Ω is a mapping with two arguments, and it is linear in each argument. Consider a vector space 𝑉, for all 𝒙,𝒚,𝒛 ∈ 𝑉,𝜆,𝜑 ∈ R,
$\Omega(\lambda\boldsymbol{x} + \varphi\boldsymbol{y}, \boldsymbol{z}) = \lambda\,\Omega(\boldsymbol{x}, \boldsymbol{z}) + \varphi\,\Omega(\boldsymbol{y}, \boldsymbol{z})$ (Ω is linear in the first argument)
$\Omega(\boldsymbol{x}, \lambda\boldsymbol{y} + \varphi\boldsymbol{z}) = \lambda\,\Omega(\boldsymbol{x}, \boldsymbol{y}) + \varphi\,\Omega(\boldsymbol{x}, \boldsymbol{z})$ (Ω is linear in the second argument)
Inner product
• Let 𝑉 be a vector space and Ω: 𝑉×𝑉 → R be a bilinear mapping.
• Ω is called symmetric if $\Omega(\boldsymbol{x}, \boldsymbol{y}) = \Omega(\boldsymbol{y}, \boldsymbol{x})$
• Ω is called positive definite if
$\forall \boldsymbol{x} \in V \setminus \{\boldsymbol{0}\} : \Omega(\boldsymbol{x}, \boldsymbol{x}) > 0, \qquad \Omega(\boldsymbol{0}, \boldsymbol{0}) = 0$
• A positive definite, symmetric bilinear mapping $\Omega \colon V \times V \to \mathbb{R}$ is called an inner product on 𝑉. We write $\langle \boldsymbol{x}, \boldsymbol{y} \rangle$ instead of $\Omega(\boldsymbol{x}, \boldsymbol{y})$.
• The pair $(V, \langle \cdot, \cdot \rangle)$ is called an inner product vector space. If we use the dot product, we call $(V, \langle \cdot, \cdot \rangle)$ a Euclidean vector space.
Example
• Consider $V = \mathbb{R}^2$. If we define
$\langle \boldsymbol{x}, \boldsymbol{y} \rangle := x_1 y_1 - (x_1 y_2 + x_2 y_1) + 2 x_2 y_2,$
• then $\langle \cdot, \cdot \rangle$ is an inner product, but different from the dot product.
• This mapping is symmetric: it is easy to derive $\langle \boldsymbol{x}, \boldsymbol{y} \rangle = \langle \boldsymbol{y}, \boldsymbol{x} \rangle$. Is it positive definite?
$\forall \boldsymbol{x} \in V \setminus \{\boldsymbol{0}\} : \langle \boldsymbol{x}, \boldsymbol{x} \rangle = x_1^2 - (x_1 x_2 + x_2 x_1) + 2 x_2^2 = (x_1 - x_2)^2 + x_2^2 > 0$
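• This inner product can be written as $\boldsymbol{x}^\top \boldsymbol{A} \boldsymbol{y}$ with a matrix 𝑨, and its properties verified numerically (the test vectors are arbitrary illustrative choices):

```python
import numpy as np

# Matrix form of <x, y> = x1*y1 - (x1*y2 + x2*y1) + 2*x2*y2
A = np.array([[1.0, -1.0],
              [-1.0, 2.0]])

def inner(x, y):
    return x @ A @ y

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])

assert np.isclose(inner(x, y), inner(y, x))   # symmetric
assert np.all(np.linalg.eigvalsh(A) > 0)      # positive definite (eigenvalues > 0)
assert not np.isclose(inner(x, y), x @ y)     # differs from the dot product
```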
3.2.3 Symmetric, Positive Definite Matrices
• Consider an 𝑛-dimensional vector space 𝑉 with an inner product $\langle \cdot, \cdot \rangle \colon V \times V \to \mathbb{R}$ and a basis $B = (\boldsymbol{b}_1, \dots, \boldsymbol{b}_n)$ of 𝑉. Writing $\boldsymbol{x} = \sum_i \psi_i \boldsymbol{b}_i$ and $\boldsymbol{y} = \sum_j \lambda_j \boldsymbol{b}_j$, bilinearity gives
$\langle \boldsymbol{x}, \boldsymbol{y} \rangle = \Big\langle \sum_{i=1}^{n} \psi_i \boldsymbol{b}_i, \sum_{j=1}^{n} \lambda_j \boldsymbol{b}_j \Big\rangle = \sum_{i=1}^{n} \sum_{j=1}^{n} \psi_i \langle \boldsymbol{b}_i, \boldsymbol{b}_j \rangle \lambda_j = \hat{\boldsymbol{x}}^\top \boldsymbol{A}\, \hat{\boldsymbol{y}},$
where $A_{ij} := \langle \boldsymbol{b}_i, \boldsymbol{b}_j \rangle$ and $\hat{\boldsymbol{x}}, \hat{\boldsymbol{y}}$ are the coordinates of 𝒙, 𝒚 with respect to the basis 𝐵.
• The inner product $\langle \cdot, \cdot \rangle$ is uniquely determined through 𝑨. The symmetry of the inner product also means that 𝑨 is symmetric.
• The positive definiteness of the inner product implies that
$\forall \boldsymbol{x} \in V \setminus \{\boldsymbol{0}\} : \langle \boldsymbol{x}, \boldsymbol{x} \rangle = \hat{\boldsymbol{x}}^\top \boldsymbol{A}\, \hat{\boldsymbol{x}} > 0$
3.2.3 Symmetric, Positive Definite Matrices
• A symmetric matrix $\boldsymbol{A} \in \mathbb{R}^{n \times n}$ that satisfies $\forall \boldsymbol{x} \in V \setminus \{\boldsymbol{0}\} : \boldsymbol{x}^\top \boldsymbol{A} \boldsymbol{x} > 0$ is called symmetric, positive definite, or just positive definite. If only $\ge$ holds, then 𝑨 is called symmetric, positive semidefinite.
• Example
$\boldsymbol{A}_1 = \begin{pmatrix} 3 & 1 \\ 1 & 4 \end{pmatrix}, \qquad \boldsymbol{A}_2 = \begin{pmatrix} 1 & 3 \\ 3 & 3 \end{pmatrix}$
• $\boldsymbol{A}_1$ is positive definite because it is symmetric and
$\boldsymbol{x}^\top \boldsymbol{A}_1 \boldsymbol{x} = \begin{pmatrix} x_1 & x_2 \end{pmatrix} \begin{pmatrix} 3 & 1 \\ 1 & 4 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = 3x_1^2 + 2x_1x_2 + 4x_2^2 = (x_1 + x_2)^2 + 2x_1^2 + 3x_2^2 > 0$
for all $\boldsymbol{x} \in V \setminus \{\boldsymbol{0}\}$.
• $\boldsymbol{A}_2$ is symmetric but not positive definite, because
$\boldsymbol{x}^\top \boldsymbol{A}_2 \boldsymbol{x} = x_1^2 + 6x_1x_2 + 3x_2^2 = (x_1 + 3x_2)^2 - 6x_2^2$
can be less than 0, e.g., for $\boldsymbol{x} = (-3, 1)^\top$.
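• A practical test: a symmetric matrix is positive definite if and only if all of its eigenvalues are positive, which is easy to check numerically. The witness vector $(-3, 1)^\top$ below follows the completion of squares above:

```python
import numpy as np

A1 = np.array([[3.0, 1.0], [1.0, 4.0]])
A2 = np.array([[1.0, 3.0], [3.0, 3.0]])

# Positive definite iff all eigenvalues of the symmetric matrix are positive
assert np.all(np.linalg.eigvalsh(A1) > 0)       # A1 is positive definite
assert np.min(np.linalg.eigvalsh(A2)) < 0       # A2 is not

# Witness for A2: x^T A2 x = (x1 + 3 x2)^2 - 6 x2^2 = -6 < 0 at x = (-3, 1)
x = np.array([-3.0, 1.0])
assert x @ A2 @ x == -6.0
```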
3.2.3 Symmetric, Positive Definite Matrices
• For a real-valued, finite-dimensional vector space 𝑉 and a basis 𝐵 of 𝑉, it holds that $\langle \cdot, \cdot \rangle \colon V \times V \to \mathbb{R}$ is an inner product if and only if there exists a symmetric, positive definite matrix $\boldsymbol{A} \in \mathbb{R}^{n \times n}$ with
$\langle \boldsymbol{x}, \boldsymbol{y} \rangle = \hat{\boldsymbol{x}}^\top \boldsymbol{A}\, \hat{\boldsymbol{y}}$
• If $\boldsymbol{A} \in \mathbb{R}^{n \times n}$ is symmetric and positive definite, the diagonal elements $a_{ii}$ of 𝑨 are positive because $a_{ii} = \boldsymbol{e}_i^\top \boldsymbol{A} \boldsymbol{e}_i = \langle \boldsymbol{e}_i, \boldsymbol{e}_i \rangle > 0$, where $\boldsymbol{e}_i$ is the 𝑖th vector of the standard basis in $\mathbb{R}^n$.
3.3 Lengths and Distances
• Any inner product induces a norm
$\|\boldsymbol{x}\| := \sqrt{\langle \boldsymbol{x}, \boldsymbol{x} \rangle}$
• For an inner product vector space $(V, \langle \cdot, \cdot \rangle)$, the induced norm $\|\cdot\|$ satisfies the Cauchy-Schwarz inequality
$|\langle \boldsymbol{x}, \boldsymbol{y} \rangle| \le \|\boldsymbol{x}\|\,\|\boldsymbol{y}\|$
Example – Lengths of Vectors Using Inner Products
• We can now use an inner product to compute vector lengths, using $\|\boldsymbol{x}\| := \sqrt{\langle \boldsymbol{x}, \boldsymbol{x} \rangle}$. Consider $\boldsymbol{x} = (1, 1)^\top \in \mathbb{R}^2$. If we use the dot product as the inner product, we obtain
$\|\boldsymbol{x}\| = \sqrt{\boldsymbol{x}^\top \boldsymbol{x}} = \sqrt{1^2 + 1^2} = \sqrt{2}$
as the length of 𝒙. (Note that $\boldsymbol{x}^\top \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \boldsymbol{y}$ is the dot product.) Let us now choose a different inner product:
$\langle \boldsymbol{x}, \boldsymbol{y} \rangle := \boldsymbol{x}^\top \begin{pmatrix} 1 & -\tfrac{1}{2} \\ -\tfrac{1}{2} & 1 \end{pmatrix} \boldsymbol{y} = x_1 y_1 - \tfrac{1}{2}(x_1 y_2 + x_2 y_1) + x_2 y_2.$
With this inner product, we obtain
$\langle \boldsymbol{x}, \boldsymbol{x} \rangle = x_1^2 - x_1 x_2 + x_2^2 = 1 - 1 + 1 = 1 \implies \|\boldsymbol{x}\| = \sqrt{1} = 1,$
so 𝒙 is "shorter" with this inner product than with the dot product.
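• The two lengths from this example can be reproduced directly:

```python
import numpy as np

x = np.array([1.0, 1.0])

# Dot product as the inner product: ||x|| = sqrt(x^T x)
len_dot = np.sqrt(x @ x)           # sqrt(2)

# Alternative inner product <x, y> = x^T A y from the example
A = np.array([[1.0, -0.5],
              [-0.5, 1.0]])
len_A = np.sqrt(x @ A @ x)         # 1.0

assert np.isclose(len_dot, np.sqrt(2))
assert np.isclose(len_A, 1.0)
```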
3.3 Lengths and Distances
• Consider an inner product space $(V, \langle \cdot, \cdot \rangle)$; then
$d(\boldsymbol{x}, \boldsymbol{y}) := \|\boldsymbol{x} - \boldsymbol{y}\| = \sqrt{\langle \boldsymbol{x} - \boldsymbol{y}, \boldsymbol{x} - \boldsymbol{y} \rangle}$
is called the distance between 𝒙 and 𝒚 for 𝒙, 𝒚 ∈ 𝑉.
• If we use the dot product as the inner product, then the distance is called Euclidean distance.
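• A minimal sketch of the Euclidean distance (the vectors are illustrative choices):

```python
import numpy as np

x = np.array([1.0, 2.0])
y = np.array([1.5, 2.0])

# Euclidean distance induced by the dot product: d(x, y) = ||x - y||
d = np.linalg.norm(x - y)      # 0.5
assert np.isclose(d, 0.5)
```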
3.3 Lengths and Distances
• The mapping
𝑑 ∶ 𝑉×𝑉 → R
(𝒙, 𝒚) ↦ 𝑑(𝒙, 𝒚)
is called a metric.
• A metric 𝑑 satisfies the following:
• 𝑑 is positive definite, i.e., $d(\boldsymbol{x}, \boldsymbol{y}) \ge 0$ for all $\boldsymbol{x}, \boldsymbol{y} \in V$ and $d(\boldsymbol{x}, \boldsymbol{y}) = 0 \iff \boldsymbol{x} = \boldsymbol{y}$
• 𝑑 is symmetric, i.e., $d(\boldsymbol{x}, \boldsymbol{y}) = d(\boldsymbol{y}, \boldsymbol{x})$ for all $\boldsymbol{x}, \boldsymbol{y} \in V$
• Triangle inequality: $d(\boldsymbol{x}, \boldsymbol{z}) \le d(\boldsymbol{x}, \boldsymbol{y}) + d(\boldsymbol{y}, \boldsymbol{z})$ for all $\boldsymbol{x}, \boldsymbol{y}, \boldsymbol{z} \in V$
• Inner products and metrics behave in opposite ways: very similar 𝒙 and 𝒚 result in a large value for the inner product and a small value for the metric.
3.4 Angles and Orthogonality
• By the Cauchy-Schwarz inequality, $|\langle \boldsymbol{x}, \boldsymbol{y} \rangle| \le \|\boldsymbol{x}\|\,\|\boldsymbol{y}\|$. Assume $\boldsymbol{x} \ne \boldsymbol{0}$, $\boldsymbol{y} \ne \boldsymbol{0}$. Then,
$-1 \le \frac{\langle \boldsymbol{x}, \boldsymbol{y} \rangle}{\|\boldsymbol{x}\|\,\|\boldsymbol{y}\|} \le 1.$
Therefore, there exists a unique $\omega \in [0, \pi]$ with
$\cos\omega = \frac{\langle \boldsymbol{x}, \boldsymbol{y} \rangle}{\|\boldsymbol{x}\|\,\|\boldsymbol{y}\|}.$
The number 𝜔 is the angle between the vectors 𝒙 and 𝒚.
• The angle between two vectors tells us how similar their orientations are.
• Using the dot product, the angle between 𝒙 and $\boldsymbol{y} = 4\boldsymbol{x}$ is 0, so their orientation is the same.
Example (Angle between Vectors)
• Let us compute the angle between $\boldsymbol{x} = (1, 1)^\top \in \mathbb{R}^2$ and $\boldsymbol{y} = (1, 2)^\top \in \mathbb{R}^2$. We use the dot product as the inner product. We get
$\cos\omega = \frac{\langle \boldsymbol{x}, \boldsymbol{y} \rangle}{\sqrt{\langle \boldsymbol{x}, \boldsymbol{x} \rangle \langle \boldsymbol{y}, \boldsymbol{y} \rangle}} = \frac{\boldsymbol{x}^\top \boldsymbol{y}}{\sqrt{\boldsymbol{x}^\top \boldsymbol{x}\; \boldsymbol{y}^\top \boldsymbol{y}}} = \frac{3}{\sqrt{10}},$
• and the angle between the two vectors is $\arccos\frac{3}{\sqrt{10}} \approx 0.32\,\text{rad}$, which corresponds to about 18°.
• We then use the inner product to characterize orthogonality.
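• The angle from this example in code:

```python
import numpy as np

x = np.array([1.0, 1.0])
y = np.array([1.0, 2.0])

# cos(w) = <x, y> / (||x|| ||y||) with the dot product as inner product
cos_w = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))  # 3 / sqrt(10)
w = np.arccos(cos_w)                                       # ~0.32 rad

assert np.isclose(cos_w, 3 / np.sqrt(10))
```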
3.4 Angles and Orthogonality
• Two vectors 𝒙 and 𝒚 are orthogonal if and only if $\langle \boldsymbol{x}, \boldsymbol{y} \rangle = 0$, and we write $\boldsymbol{x} \perp \boldsymbol{y}$. If additionally $\|\boldsymbol{x}\| = 1 = \|\boldsymbol{y}\|$, i.e., the vectors are unit vectors, then 𝒙 and 𝒚 are orthonormal.
• The 𝟎-vector is orthogonal to every vector in the vector space.
• Example (Orthogonal Vectors)
• Consider $\boldsymbol{x} = (1, 1)^\top$ and $\boldsymbol{y} = (-1, 1)^\top$.
• Using the dot product as the inner product, we have $\langle \boldsymbol{x}, \boldsymbol{y} \rangle = 0$, so $\boldsymbol{x} \perp \boldsymbol{y}$.
• However, if we choose the inner product
$\langle \boldsymbol{x}, \boldsymbol{y} \rangle = \boldsymbol{x}^\top \begin{pmatrix} 2 & 0 \\ 0 & 1 \end{pmatrix} \boldsymbol{y},$
• the angle 𝜔 between 𝒙 and 𝒚 is given by
$\cos\omega = \frac{\langle \boldsymbol{x}, \boldsymbol{y} \rangle}{\|\boldsymbol{x}\|\,\|\boldsymbol{y}\|} = -\frac{1}{3} \implies \omega \approx 1.91\,\text{rad} \approx 109.5°,$
so 𝒙 and 𝒚 are not orthogonal with respect to this inner product.
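• This example, where orthogonality depends on the chosen inner product, checks out numerically:

```python
import numpy as np

x = np.array([1.0, 1.0])
y = np.array([-1.0, 1.0])

# Orthogonal under the dot product
assert x @ y == 0

# Not orthogonal under <u, v> = u^T A v with the matrix from the example
A = np.array([[2.0, 0.0],
              [0.0, 1.0]])
inner = lambda u, v: u @ A @ v
cos_w = inner(x, y) / np.sqrt(inner(x, x) * inner(y, y))   # -1/3
assert np.isclose(cos_w, -1 / 3)
```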
3.4 Angles and Orthogonality
• A square matrix $\boldsymbol{A} \in \mathbb{R}^{n \times n}$ is an orthogonal matrix if and only if its columns are orthonormal, so that
$\boldsymbol{A}\boldsymbol{A}^\top = \boldsymbol{I} = \boldsymbol{A}^\top\boldsymbol{A},$
which implies that
$\boldsymbol{A}^{-1} = \boldsymbol{A}^\top,$
i.e., the inverse is obtained by simply transposing the matrix.
• The length of a vector 𝒙 is not changed when transforming it using an orthogonal matrix 𝑨. For the dot product, we obtain
$\|\boldsymbol{A}\boldsymbol{x}\|^2 = (\boldsymbol{A}\boldsymbol{x})^\top(\boldsymbol{A}\boldsymbol{x}) = \boldsymbol{x}^\top\boldsymbol{A}^\top\boldsymbol{A}\boldsymbol{x} = \boldsymbol{x}^\top\boldsymbol{I}\boldsymbol{x} = \boldsymbol{x}^\top\boldsymbol{x} = \|\boldsymbol{x}\|^2$
3.4 Angles and Orthogonality
• The angle between any two vectors 𝒙 and 𝒚, as measured by their inner product, is also unchanged when transforming both of them using an orthogonal matrix 𝑨. Using the dot product as the inner product,
$\cos\omega = \frac{(\boldsymbol{A}\boldsymbol{x})^\top(\boldsymbol{A}\boldsymbol{y})}{\|\boldsymbol{A}\boldsymbol{x}\|\,\|\boldsymbol{A}\boldsymbol{y}\|} = \frac{\boldsymbol{x}^\top\boldsymbol{A}^\top\boldsymbol{A}\boldsymbol{y}}{\sqrt{\boldsymbol{x}^\top\boldsymbol{A}^\top\boldsymbol{A}\boldsymbol{x}\;\boldsymbol{y}^\top\boldsymbol{A}^\top\boldsymbol{A}\boldsymbol{y}}} = \frac{\boldsymbol{x}^\top\boldsymbol{y}}{\|\boldsymbol{x}\|\,\|\boldsymbol{y}\|}.$
• Orthogonal matrices 𝑨 with $\boldsymbol{A}^{-1} = \boldsymbol{A}^\top$ preserve both angles and distances.
• Orthogonal matrices define transformations that are rotations.
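• A 2-D rotation matrix is a concrete orthogonal matrix; the preservation of lengths and dot products (hence angles) can be verified directly (the rotation angle and test vectors are illustrative):

```python
import numpy as np

theta = 0.7
# 2-D rotation matrix: columns are orthonormal, so A^T A = I
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(A.T @ A, np.eye(2))
assert np.allclose(np.linalg.inv(A), A.T)   # inverse is the transpose

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])

# Lengths and dot products (hence angles) are preserved
assert np.isclose(np.linalg.norm(A @ x), np.linalg.norm(x))
assert np.isclose((A @ x) @ (A @ y), x @ y)
```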
3.5 Orthonormal Basis
• Consider an 𝑛-dimensional vector space 𝑉 and a basis $\{\boldsymbol{b}_1, \dots, \boldsymbol{b}_n\}$ of 𝑉. For all $i, j = 1, \dots, n$, if
$\langle \boldsymbol{b}_i, \boldsymbol{b}_j \rangle = 0 \text{ for } i \ne j \quad (1)$
$\langle \boldsymbol{b}_i, \boldsymbol{b}_i \rangle = 1 \quad (2)$
then the basis is called an orthonormal basis (ONB).
If only (1) is satisfied, the basis is called an orthogonal basis.
Example (Orthonormal Basis)
• The canonical/standard basis for a Euclidean vector space $\mathbb{R}^n$ is an orthonormal basis, where the inner product is the dot product of vectors.
• In $\mathbb{R}^2$, the vectors
$\boldsymbol{b}_1 = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ 1 \end{pmatrix}, \qquad \boldsymbol{b}_2 = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ -1 \end{pmatrix}$
form an orthonormal basis since $\boldsymbol{b}_1^\top \boldsymbol{b}_2 = 0$ and $\|\boldsymbol{b}_1\| = 1 = \|\boldsymbol{b}_2\|$.
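• A compact numerical check: stacking the basis vectors as columns of 𝐵, orthonormality is exactly the condition $B^\top B = I$:

```python
import numpy as np

b1 = np.array([1.0, 1.0]) / np.sqrt(2)
b2 = np.array([1.0, -1.0]) / np.sqrt(2)

# B^T B = I encodes <b_i, b_j> = 0 for i != j and <b_i, b_i> = 1
B = np.column_stack([b1, b2])
assert np.allclose(B.T @ B, np.eye(2))
```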
3.6 Orthogonal Complement
• We now look at vector spaces that are orthogonal to each other
• Consider a 𝐷-dimensional vector space 𝑉 and an 𝑀-dimensional subspace $U \subseteq V$. The orthogonal complement $U^\perp$ is a $(D - M)$-dimensional subspace of 𝑉 and contains all vectors in 𝑉 that are orthogonal to every vector in 𝑈.
• $U \cap U^\perp = \{\boldsymbol{0}\}$, so that any vector $\boldsymbol{x} \in V$ can be uniquely decomposed into
$\boldsymbol{x} = \sum_{m=1}^{M} \lambda_m \boldsymbol{b}_m + \sum_{j=1}^{D-M} \psi_j \boldsymbol{b}_j^\perp, \quad \lambda_m, \psi_j \in \mathbb{R},$
where $(\boldsymbol{b}_1, \dots, \boldsymbol{b}_M)$ is a basis of 𝑈 and $(\boldsymbol{b}_1^\perp, \dots, \boldsymbol{b}_{D-M}^\perp)$ is a basis of $U^\perp$.
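• One way to compute an orthogonal complement numerically is via the SVD: $U^\perp$ is the null space of the matrix whose rows are basis vectors of 𝑈. A sketch with an illustrative subspace $U \subseteq \mathbb{R}^3$ (so $D = 3$, $M = 2$):

```python
import numpy as np

# U = span of the two columns of B (an illustrative choice, not from the slides)
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# U^perp is the null space of B^T: the rows of Vt beyond the numerical rank
_, s, Vt = np.linalg.svd(B.T)
rank = int(np.sum(s > 1e-10))
U_perp = Vt[rank:]

# dim(U^perp) = D - M = 1, and its basis vector is orthogonal to all of U
assert U_perp.shape == (1, 3)
assert np.allclose(B.T @ U_perp.T, 0)
```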
3.6 Orthogonal Complement
• A hyperplane 𝑈 in a three-dimensional vector space can be described by its normal vector, which spans its orthogonal complement $U^\perp$.
• Generally, orthogonal complements can be used to describe hyperplanes in 𝑛-dimensional vector and affine spaces.
3.7 Inner Product of Functions
• We can think of a vector $\boldsymbol{x} \in \mathbb{R}^n$ as a function with 𝑛 function values.
• The concept of an inner product can be generalized to vectors with an infinite number of entries (countably infinite) and also continuous-valued functions (uncountably infinite).
• Then the sum over individual components of vectors turns into an integral.
• An inner product of two functions $u \colon \mathbb{R} \to \mathbb{R}$ and $v \colon \mathbb{R} \to \mathbb{R}$ can be defined as the definite integral
$\langle u, v \rangle := \int_a^b u(x)\,v(x)\,dx$
for lower and upper limits $a, b < \infty$.
• If the above equation evaluates to 0, the functions 𝑢 and 𝑣 are orthogonal.
3.7 Inner Product of Functions
• Example
• $u = \sin(x)$, $v = \cos(x)$; the integrand $f(x) = \sin(x)\cos(x)$ is odd, i.e., $f(-x) = -f(x)$.
• Therefore, the integral of this product with limits $a = -\pi$, $b = \pi$ evaluates to 0, so sin and cos are orthogonal functions.
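• This can be checked with simple numerical quadrature (a midpoint-rule sketch; the number of grid points is an arbitrary choice):

```python
import numpy as np

def inner(u, v, a, b, n=100000):
    """Midpoint-rule approximation of <u, v> = integral_a^b u(x) v(x) dx."""
    h = (b - a) / n
    x = np.linspace(a, b, n, endpoint=False) + h / 2   # interval midpoints
    return np.sum(u(x) * v(x)) * h

# sin(x) cos(x) is odd, so the integral over [-pi, pi] vanishes
val = inner(np.sin, np.cos, -np.pi, np.pi)
assert abs(val) < 1e-8
```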
Check your understanding
(A) A norm characterises the length of a vector.
(B) The norm of a vector is a complex number.
(C) The inner product assigns each vector a real number.
(D) A metric characterises the similarity between two vectors.
(E) Any bilinear mapping introduces an inner product.
(F) Any inner product introduces a norm.
(G) Any vector in $U^\perp$ is orthogonal to any vector in 𝑈.
(H) In $\mathbb{R}^2$ there can be infinitely many bases, but only a finite number of orthogonal/orthonormal bases.