
Supervised Learning

Data
• Dataset:
  $\mathcal{D}_n = \{(x^{(1)}, y^{(1)}), \ldots, (x^{(n)}, y^{(n)})\}$, with $x^{(i)} \in \mathbb{R}^d$ and $y^{(i)} \in \{+1, -1\}$

• $\varphi(x)$: feature representation, $\varphi(x) \in \mathbb{R}^d$ (a toy sketch follows below)
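To make the notation concrete, here is a minimal Python sketch of a labelled dataset and a feature map; the toy values and the helper name `phi` are assumptions for illustration, not part of the slides.

```python
import numpy as np

# Toy labelled dataset D_n = {(x^(1), y^(1)), ..., (x^(n), y^(n))}:
# raw inputs x^(i) and labels y^(i) in {+1, -1}. All values here are made up.
raw_X = [(-1.0, 2.0), (0.5, 0.5), (3.0, -1.0)]
y = np.array([+1, +1, -1])

def phi(x):
    """Illustrative feature representation phi(x) in R^d:
    the raw coordinates plus one simple product feature."""
    x1, x2 = x
    return np.array([x1, x2, x1 * x2])

# Stack the feature vectors into an n x d matrix.
X = np.vstack([phi(x) for x in raw_X])
print(X.shape)  # (3, 3): n = 3 examples, d = 3 features
```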

Hypotheses
• A hypothesis: $y = h(x; \theta)$
• $h \in \mathcal{H}$ (the hypothesis class); a linear-classifier example is sketched below
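As one concrete (assumed) choice of hypothesis class, the sketch below uses linear classifiers $h(x; \theta, \theta_0) = \mathrm{sign}(\theta \cdot x + \theta_0)$; the slides leave the class abstract, so this only illustrates how parameters select a hypothesis.

```python
import numpy as np

def h(x, theta, theta_0):
    """One hypothesis from an (assumed) linear class:
    predict sign(theta . x + theta_0), returned as +1 or -1."""
    return +1 if np.dot(theta, x) + theta_0 > 0 else -1

# Different parameter values pick out different hypotheses from the same class H.
x = np.array([3.0, 1.0, 0.0])
print(h(x, theta=np.array([1.0, -1.0, 0.5]), theta_0=0.0))  # 3 - 1 + 0 = 2 > 0, prints 1 (i.e. +1)
```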

Loss function
• $L(g, a)$, where $g \in \{+1, -1\}$ and $a \in \{+1, -1\}$
• How bad is it that we predicted $g$ when $a$ is the true answer? (two toy examples are sketched below)
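Two toy loss functions $L(g, a)$ in Python: the usual zero-one loss, and an asymmetric variant whose costs (10 vs. 1) are purely illustrative assumptions.

```python
def zero_one_loss(g, a):
    """L(g, a) = 0 if the guess g equals the true label a, else 1."""
    return 0 if g == a else 1

def asymmetric_loss(g, a):
    """Assumed asymmetric costs: predicting -1 when the truth is +1
    is treated as 10x worse than the opposite mistake."""
    if g == a:
        return 0
    return 10 if a == +1 else 1

print(zero_one_loss(+1, -1))    # 1
print(asymmetric_loss(-1, +1))  # 10
```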

Evaluating hypotheses
• Ideally: Small loss on new data (test error)
  $\mathcal{E}(h) = \frac{1}{n'} \sum_{i=n+1}^{n+n'} L(h(x^{(i)}), y^{(i)})$
• What we can do (for now): Small loss on training data (training error; both errors are computed in the sketch below)
  $\mathcal{E}_n(h) = \frac{1}{n} \sum_{i=1}^{n} L(h(x^{(i)}), y^{(i)})$
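A sketch of computing both quantities for a fixed hypothesis under the zero-one loss; the linear hypothesis, the train/test split, and all data values are assumptions for illustration.

```python
import numpy as np

def zero_one_loss(g, a):
    return 0 if g == a else 1

def average_loss(predict, loss, X, y):
    """Mean loss of a hypothesis over a set of labelled examples."""
    return np.mean([loss(predict(x), yi) for x, yi in zip(X, y)])

# Assumed linear hypothesis and data; the test labels are chosen so one prediction is wrong.
theta, theta_0 = np.array([1.0, -1.0]), 0.0
predict = lambda x: +1 if np.dot(theta, x) + theta_0 > 0 else -1

X_train = np.array([[2.0, 1.0], [0.0, 1.0], [3.0, 0.5]]); y_train = np.array([+1, -1, +1])
X_test  = np.array([[1.0, 2.0], [2.0, 0.0]]);             y_test  = np.array([-1, -1])

print("training error:", average_loss(predict, zero_one_loss, X_train, y_train))  # 0.0
print("test error:    ", average_loss(predict, zero_one_loss, X_test, y_test))    # 0.5
```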

Learning algorithms
• A learning algorithm maps the training data to a hypothesis: $\mathcal{D}_n \rightarrow \text{LearningAlgorithm}(\mathcal{H}) \rightarrow h$
• How to come up with learning algorithms:
  • Be a clever (or not so clever) human
  • Use optimisation methods (a toy example is sketched below)
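Finally, a deliberately crude (assumed) learning algorithm to illustrate the dataset-in, hypothesis-out interface: random search over linear-classifier parameters, keeping whichever candidate has the lowest training error. This is only a sketch of "use optimisation methods", not a method prescribed by the slides.

```python
import numpy as np

def zero_one_loss(g, a):
    return 0 if g == a else 1

def training_error(theta, theta_0, X, y):
    preds = [+1 if np.dot(theta, x) + theta_0 > 0 else -1 for x in X]
    return np.mean([zero_one_loss(g, a) for g, a in zip(preds, y)])

def learning_algorithm(X, y, num_tries=1000, seed=0):
    """Toy learner: sample random linear hypotheses and keep the best one on the training data."""
    rng = np.random.default_rng(seed)
    best, best_err = None, np.inf
    for _ in range(num_tries):
        theta, theta_0 = rng.normal(size=X.shape[1]), rng.normal()
        err = training_error(theta, theta_0, X, y)
        if err < best_err:
            best, best_err = (theta, theta_0), err
    return best  # the learned hypothesis h, encoded by its parameters

# Usage on a small assumed dataset D_n:
X = np.array([[2.0, 1.0], [0.0, 1.0], [3.0, 0.5], [-1.0, 2.0]])
y = np.array([+1, -1, +1, -1])
theta, theta_0 = learning_algorithm(X, y)
print("training error of learned h:", training_error(theta, theta_0, X, y))
```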