
Homework 1 Statistical Machine Learning II, Semester B 2019/2020
Notes: Please upload all your code with your assignment on Canvas before 7pm on Feb 26, 2020. Homework must be neatly written up or typed for submission. I reserve the right to refuse homework that is deemed (by me) to be excessively messy.
1. Equivalence between SVM and penalized minimization. Prove that the SVM

$$\max_{w,\,b,\,\|w\|=1} \; M \qquad \text{subject to} \quad y_i(w^\top x_i + b) \ge M(1 - \xi_i), \quad \xi_i \ge 0, \quad \sum_i \xi_i \le C$$
is equivalent to the penalized minimization problem:
$$\min_{w,\,b} \; \sum_{i=1}^{n} \bigl(1 - y_i(w^\top x_i + b)\bigr)_+ + \lambda \|w\|^2,$$

where $(1-z)_+$ is the hinge loss, $(1-z)_+ = \max(0, 1-z)$.
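For intuition only (this is not part of the required proof), the penalized form can be minimized directly on toy data. Below is a minimal numerical sketch; the dataset, the value of $\lambda$, and the choice of a derivative-free optimizer (the hinge loss is non-smooth) are all illustrative assumptions, not the assignment's method.

```python
# Sketch: minimize  sum_i (1 - y_i(w^T x_i + b))_+ + lambda * ||w||^2
# on a toy 2-D dataset.  All data and settings below are assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, size=(20, 2)),   # class -1
               rng.normal(+2, 1, size=(20, 2))])  # class +1
y = np.repeat([-1.0, 1.0], 20)
lam = 0.1  # illustrative penalty weight

def objective(theta):
    w, b = theta[:2], theta[2]
    margins = y * (X @ w + b)
    hinge = np.maximum(0.0, 1.0 - margins)  # (1 - z)_+ = max(0, 1 - z)
    return hinge.sum() + lam * (w @ w)

# Nelder-Mead avoids differentiating the non-smooth hinge loss.
res = minimize(objective, x0=np.zeros(3), method="Nelder-Mead")
w_hat, b_hat = res.x[:2], res.x[2]
print("w:", w_hat, " b:", b_hat)
print("training accuracy:", np.mean(np.sign(X @ w_hat + b_hat) == y))
```

On well-separated data such as this, the fitted hyperplane should achieve zero training error, consistent with the equivalence the problem asks you to prove.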
2. Linearly separable datasets. Consider two sets of points in $\mathbb{R}^p$, $X^0 = \{x^0_1, \ldots, x^0_{N_0}\} \subseteq \mathbb{R}^p$ and $X^1 = \{x^1_1, \ldots, x^1_{N_1}\} \subseteq \mathbb{R}^p$. If $X^0$ and $X^1$ are linearly separable, then it is possible to draw a hyperplane $H \subseteq \mathbb{R}^p$ so that all the points in $X^0$ lie on one side of $H$ and all the points in $X^1$ lie on the other side of $H$. When $p = 2$, the hyperplane $H$ is just a line, and two linearly separable sets (black and red) are depicted in the figure below. Linear separability is an important concept in classification because if the training data from two classes are linearly separable, then it is always possible to construct a linear classifier with zero training errors. More formally, we have the following definition:
The sets $X^0$ and $X^1$ are linearly separable if there exists a vector $w \in \mathbb{R}^p$ and a real number $a \in \mathbb{R}$ such that $w^\top x > a$ for all $x \in X^0$ and $w^\top x < a$ for all $x \in X^1$.
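As an aside, this definition can be tested computationally: $X^0$ and $X^1$ are linearly separable exactly when the feasibility linear program sketched below has a solution. The strict inequalities are replaced by margin-1 inequalities ($w^\top x \ge a + 1$ and $w^\top x \le a - 1$), which is equivalent after rescaling $(w, a)$. The helper name and the toy data are illustrative assumptions.

```python
# Sketch: test linear separability of two point sets via an LP
# feasibility problem in the variables z = (w, a).
import numpy as np
from scipy.optimize import linprog

def linearly_separable(X0, X1):
    """Return (separable, w, a) using a feasibility LP."""
    p = X0.shape[1]
    # Constraints in A_ub @ z <= b_ub form with z = (w, a):
    #   x in X0:  w^T x - a >= 1   ->  (-x, +1) @ z <= -1
    #   x in X1:  w^T x - a <= -1  ->  (+x, -1) @ z <= -1
    A = np.vstack([np.hstack([-X0, np.ones((len(X0), 1))]),
                   np.hstack([ X1, -np.ones((len(X1), 1))])])
    b = -np.ones(len(X0) + len(X1))
    res = linprog(c=np.zeros(p + 1), A_ub=A, b_ub=b,
                  bounds=[(None, None)] * (p + 1))
    if not res.success:          # LP infeasible: no separating hyperplane
        return False, None, None
    w, a = res.x[:p], res.x[p]
    return True, w, a

# Toy example in R^2 (assumed data):
X0 = np.array([[0.0, 0.0], [1.0, 0.5]])
X1 = np.array([[3.0, 3.0], [4.0, 2.5]])
print(linearly_separable(X0, X1))
```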