Exercises for the course
Machine Learning 1
Winter semester 2021/22
Institut für Softwaretechnik und theoretische Informatik, Fakultät IV, Technische Universität Berlin
Prof. Dr. Klaus-Robert Müller
Email:
Exercise Sheet 7
A kernel function k : Rᵈ × Rᵈ → R must satisfy Mercer's condition, which requires that for any finite sequence of data points x1, …, xn ∈ Rᵈ and coefficients c1, …, cn ∈ R the inequality

∑_{i=1}^n ∑_{j=1}^n ci cj k(xi, xj) ≥ 0

is satisfied. If this is the case, the kernel is called a Mercer kernel.
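For intuition (this is not part of the exercises), the condition can be checked numerically on a finite sample: the double sum is the quadratic form cᵀKc, where K is the Gram matrix with Kij = k(xi, xj), so Mercer's condition amounts to K being positive semi-definite for every sample. A minimal Python/NumPy sketch, using the Gaussian kernel, an arbitrary bandwidth and a random sample purely as illustrative assumptions:

import numpy as np

# Gaussian (RBF) kernel, used here only as an illustration; gamma is an arbitrary choice.
def gaussian_kernel(x, xp, gamma=0.5):
    return np.exp(-gamma * np.sum((x - xp) ** 2))

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))   # n = 20 sample points in R^3
c = rng.normal(size=20)        # arbitrary coefficients c_1, ..., c_n

# Gram matrix K with K_ij = k(x_i, x_j)
K = np.array([[gaussian_kernel(xi, xj) for xj in X] for xi in X])

# The double sum equals the quadratic form c^T K c and should be >= 0.
print("c^T K c =", c @ K @ c)
# Equivalently, K should be positive semi-definite (up to rounding error).
print("smallest eigenvalue of K =", np.linalg.eigvalsh(K).min())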
Conversely, Mercer's theorem states that if k is a Mercer kernel on Rᵈ, then there exists a Hilbert space (i.e., a finite- or infinite-dimensional R-vector space with norm and scalar product) F, the so-called feature space, and a continuous map φ : Rᵈ → F such that

k(x, x′) = ⟨φ(x), φ(x′)⟩_F for all x, x′ ∈ Rᵈ.
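As a concrete illustration of this correspondence (the kernel chosen here is for illustration only and is not one of the kernels in the exercises): for k(x, x′) = 1 + ⟨x, x′⟩ one may take F = Rᵈ⁺¹ and φ(x) = (1, x1, …, xd), since ⟨φ(x), φ(x′)⟩ = 1 + ⟨x, x′⟩. A short numerical check of this identity:

import numpy as np

def k(x, xp):
    # illustrative kernel k(x, x') = 1 + <x, x'>
    return 1.0 + x @ xp

def phi(x):
    # feature map phi(x) = (1, x_1, ..., x_d), so the feature space is F = R^(d+1)
    return np.concatenate(([1.0], x))

rng = np.random.default_rng(1)
x, xp = rng.normal(size=3), rng.normal(size=3)

print(k(x, xp))          # kernel evaluated directly
print(phi(x) @ phi(xp))  # inner product in feature space; the two values agree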
Exercise 1: (3 × 20 P)

(a) Show that the following are Mercer kernels.
i. k(x, x′) = ⟨x, x′⟩
ii. k(x, x′) = f(x) · f(x′), where f : Rᵈ → R is an arbitrary continuous function
(b) Let k1, k2 be two Mercer kernels, each of which we assume admits a finite-dimensional feature map. Show that the following are again Mercer kernels.
i. k(x, x′) = k1(x, x′) + k2(x, x′)
ii. k(x, x′) = k1(x, x′) · k2(x, x′)
(c) Show, using the results above, that the polynomial kernel of degree d, given by k(x, x′) = (⟨x, x′⟩ + θ)ᵈ with θ ∈ R+, is a Mercer kernel.
Exercise 2: The Feature Map (4 × 10 P)
Consider the homogeneous polynomial kernel k of degree 2, k : R² × R² → R, where

k(x, y) = ⟨x, y⟩² = (x1 y1 + x2 y2)².

(a) Show that F = R³ and φ(x) = (x1², √2 x1 x2, x2²)ᵀ are possible choices for the feature space and the feature map. (A numerical sanity check of this identity is sketched after this exercise.)
(b) Consider the unit circle C = { (cos θ, sin θ)ᵀ ; 0 ≤ θ < 2π }. Show that the image φ(C) lies on a plane H in R³.

(c) Consider the plane A = { (t, s)ᵀ ; t, s ∈ R }. Find a point P in F which is not contained in φ(A).
(d) Find a feature map associated to the kernel k : Rᵈ × Rᵈ → R with k(x, y) = ⟨x, y⟩² = (∑_{i=1}^d xi yi)².
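As referenced in part (a), the following is a minimal numerical sanity check (it replaces no proof) that the stated map φ(x) = (x1², √2 x1 x2, x2²)ᵀ reproduces the kernel values; the random sample is an arbitrary choice:

import numpy as np

def k(x, y):
    # homogeneous polynomial kernel of degree 2 on R^2 (Exercise 2)
    return (x @ y) ** 2

def phi(x):
    # candidate feature map from part (a): phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2)
    return np.array([x[0] ** 2, np.sqrt(2.0) * x[0] * x[1], x[1] ** 2])

rng = np.random.default_rng(2)
for _ in range(5):
    x, y = rng.normal(size=2), rng.normal(size=2)
    # both printed values agree up to floating-point rounding
    print(k(x, y), phi(x) @ phi(y))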