
School of Computing and Information Systems The University of Melbourne
COMP90049 Introduction to Machine Learning (Semester 1, 2022) Workshop: week 10
1. Consider the two-level network illustrated below. It is composed of three perceptrons. The two perceptrons of the first level implement the AND and OR functions, respectively.
Determine the weights θ11, θ21 and bias θ01 such that the network implements the XOR function. The initial weights are set to zero, i.e., θ01 = θ11 = θ21 = 0, and the learning rate 𝜂 (eta) is set to 0.1.


• The input function for the perceptron on level 2 is the weighted sum (Σ) of its inputs.
• The activation function f for the perceptron on level 2 is a step function: f(z) = 1 if z ≥ 0, and f(z) = 0 otherwise.
• Assume that the weights for the perceptrons of the first level are given (see the sketch after these notes).
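
As a rough illustration only (not the intended pen-and-paper solution): the Python sketch below assumes a step activation that fires 1 when its input is at least 0, uses illustrative first-level weights for the AND and OR units, and applies the standard perceptron update θi ← θi + η(y − ŷ)·xi to the level-2 weights. All of these concrete choices are assumptions made for the example.

def step(z):
    # Assumed step activation: fires 1 when the weighted sum is >= 0
    return 1 if z >= 0 else 0

def and_unit(x1, x2):
    # Illustrative (assumed) fixed weights implementing AND
    return step(1.0 * x1 + 1.0 * x2 - 1.5)

def or_unit(x1, x2):
    # Illustrative (assumed) fixed weights implementing OR
    return step(1.0 * x1 + 1.0 * x2 - 0.5)

# Level-2 perceptron: bias theta01 plus weights theta11 (from AND) and theta21 (from OR)
theta = [0.0, 0.0, 0.0]   # [theta01, theta11, theta21], initialised to zero
eta = 0.1

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]   # XOR truth table

converged = False
while not converged:
    converged = True
    for (x1, x2), y in data:
        a1, a2 = and_unit(x1, x2), or_unit(x1, x2)
        y_hat = step(theta[0] + theta[1] * a1 + theta[2] * a2)
        if y_hat != y:
            # Perceptron update; the bias is treated as a weight on a constant input of 1
            theta[0] += eta * (y - y_hat)
            theta[1] += eta * (y - y_hat) * a1
            theta[2] += eta * (y - y_hat) * a2
            converged = False

print(theta)   # one setting of [theta01, theta11, theta21] that makes the network compute XOR

Running the loop until no example is misclassified yields one valid setting of θ01, θ11 and θ21; the exercise asks you to derive such a setting by hand.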
2. Consider the following multilayer perceptron.
The network should implement the XOR function. Perform one epoch of backpropagation as introduced in the lecture on multilayer perceptrons.
• The activation function f for a perceptron is the sigmoid function: f(z) = 1 / (1 + e^(−z)).
• The threshold nodes are not shown in the network; their activation is fixed at −1.
• Use the following initial parameter values:
θ^(1)_01 = 2     θ^(1)_02 = −1     θ^(2)_01 = −2
θ^(1)_11 = 6     θ^(1)_12 = 8      θ^(2)_11 = 6
θ^(1)_21 = −6    θ^(1)_22 = −8     θ^(2)_21 = −6
• The learning rate is set to η = 0.7
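
For reference when working through (i)–(iii) below, one common formulation of the quantities involved is sketched here, assuming the squared error E = ½(y − ŷ)² and the threshold-node convention above; the lecture's exact sign and indexing conventions may differ slightly:

f(z) = 1 / (1 + e^(−z)),      f'(z) = f(z) (1 − f(z))
E = ½ (y − ŷ)²
δ_out = (y − ŷ) ŷ (1 − ŷ)
δ_j = h_j (1 − h_j) θ^(2)_j1 δ_out                 for each hidden unit j
∆θ^(2)_j1 = η δ_out h_j,      ∆θ^(1)_ij = η δ_j x_i    (with h_0 = x_0 = −1 for the threshold nodes)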

i. Compute the activations of the hidden and output neurons.
ii. Compute the error of the network.
iii. Backpropagate the error to determine ∆θij for all weights θij and update the weights θij (a code sketch of one such step follows below).
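
A minimal Python sketch of one such step, under the same assumptions as the formulas above (squared error, sigmoid activations, threshold-node activation of −1); it is intended as a way to check hand calculations rather than as the official solution:

import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

eta = 0.7

# Initial weights; index 0 refers to the threshold node, whose activation is fixed at -1.
theta1 = {(0, 1):  2, (0, 2): -1,    # theta^(1)_01, theta^(1)_02
          (1, 1):  6, (1, 2):  8,    # theta^(1)_11, theta^(1)_12
          (2, 1): -6, (2, 2): -8}    # theta^(1)_21, theta^(1)_22
theta2 = {0: -2, 1: 6, 2: -6}        # theta^(2)_01, theta^(2)_11, theta^(2)_21

def backprop_step(x1, x2, y):
    # i. Forward pass: activations of the hidden and output neurons
    x = {0: -1, 1: x1, 2: x2}                                  # inputs incl. threshold node
    z = {j: sum(theta1[(i, j)] * x[i] for i in x) for j in (1, 2)}
    h = {0: -1, 1: sigmoid(z[1]), 2: sigmoid(z[2])}            # hidden layer incl. threshold node
    y_hat = sigmoid(sum(theta2[j] * h[j] for j in h))

    # ii. Error of the network (assumed: squared error)
    error = 0.5 * (y - y_hat) ** 2

    # iii. Backpropagate: deltas, then weight updates with learning rate eta
    delta_out = (y - y_hat) * y_hat * (1 - y_hat)
    delta_hid = {j: h[j] * (1 - h[j]) * theta2[j] * delta_out for j in (1, 2)}
    for j in theta2:
        theta2[j] += eta * delta_out * h[j]
    for (i, j) in theta1:
        theta1[(i, j)] += eta * delta_hid[j] * x[i]
    return h[1], h[2], y_hat, error

# One example of the XOR training set; a full epoch would repeat this for all four inputs.
print(backprop_step(0, 1, 1))

Calling backprop_step once per training example, over all four XOR inputs, corresponds to one epoch.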
