2020/8/14 https://www.cse.unsw.edu.au/~cs9444/20T2/quiz/ans/quiz3_answers.html
COMP9444 Neural Networks and Deep Learning Quiz 3 (Convolutional Networks)
This is an optional quiz to test your understanding of the material from Weeks 3 and 4.
1. Sketch the following activation functions, and write their formula: Sigmoid, Tanh, ReLU.
Sigmoid: f(x) = 1/(1 + exp(-x))
Tanh: f(x) = tanh(x) = (e^x - e^(-x))/(e^x + e^(-x))
ReLU: f(x) = 0 if x ≤ 0; f(x) = x if x > 0
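These three activation functions can be sketched directly in NumPy (a minimal illustration, not part of the original answer):

```python
import numpy as np

def sigmoid(x):
    # 1 / (1 + e^-x): squashes any real input into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # (e^x - e^-x) / (e^x + e^-x): squashes input into (-1, 1)
    return np.tanh(x)

def relu(x):
    # 0 for x <= 0, x for x > 0
    return np.maximum(0.0, x)

print(sigmoid(0.0))  # 0.5
print(tanh(0.0))     # 0.0
print(relu(-3.0))    # 0.0
```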
2. Explain how Dropout is used for neural networks, in both the training and testing phase.
During each minibatch of training, a fixed percentage of nodes (usually one half) are chosen to be inactive. In the testing phase, all nodes are active, but the activation of each node is multiplied by the probability that a node was kept active during training.
3. Explain what is meant by Overfitting in neural networks, and list four different methods for avoiding it.
Overfitting occurs when the training set error continues to decrease, but the test set error stalls or increases. It can be avoided by:
a. reducing the number of neurons or connections in the network
b. early stopping, with a validation set
c. weight decay
d. dropout
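Method (b), early stopping, can be sketched as a simple patience rule (an illustrative sketch; the loss values and `patience` parameter are hypothetical, not from the course material):

```python
# Stop when validation loss has not improved for `patience` epochs,
# even if training loss is still falling.
def early_stop_epoch(val_losses, patience=3):
    best, best_epoch = float("inf"), 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            return best_epoch  # revert to the best validation epoch
    return len(val_losses) - 1

# Hypothetical validation losses: improvement stalls after epoch 2.
losses = [1.0, 0.8, 0.7, 0.72, 0.75, 0.74, 0.76]
print(early_stop_epoch(losses))  # 2
```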
4. Write the formula for the Softmax loss function.
softmax: E = -(z_j - log Σ_i exp(z_i)), where j is the correct class.
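The formula can be computed directly (a minimal sketch; the max-subtraction is a standard numerical-stability trick that cancels out of the final value):

```python
import numpy as np

def softmax_loss(z, correct):
    # E = -(z_j - log sum_i exp(z_i)), where j is the correct class.
    # Subtracting max(z) before exponentiating avoids overflow;
    # log sum exp(z) = zmax + log sum exp(z - zmax).
    zmax = z.max()
    return -(z[correct] - zmax - np.log(np.sum(np.exp(z - zmax))))

z = np.array([2.0, 1.0, 0.1])
print(softmax_loss(z, 0))  # small loss: class 0 has the largest output
print(softmax_loss(z, 2))  # larger loss: class 2 has the smallest output
```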
5. Write the formula for the activation of the node at location (j, k) in the i-th filter of a convolutional neural network which is connected by weights to all nodes in an M × N window from the L channels in the previous layer, assuming bias weights are included and the activation function is g(). How many free parameters would there be in this layer?
Z^i_{j,k} = g( b_i + Σ_{l=1}^{L} Σ_{m=1}^{M} Σ_{n=1}^{N} K^i_{l,m,n} V_{l, j+m, k+n} )
Because of weight sharing, each filter has L × M × N weights plus one bias, so the layer has (L × M × N + 1) free parameters per filter.
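The triple sum above translates directly into nested loops (an illustrative sketch; the sizes L, M, N, the 5×5 input, and the random weights are all hypothetical):

```python
import numpy as np

L, M, N = 3, 2, 2                # channels and window size (hypothetical)
H, W = 5, 5                      # previous-layer spatial size (hypothetical)
rng = np.random.default_rng(1)
V = rng.random((L, H, W))        # previous-layer activations V[l, x, y]
K = rng.random((L, M, N))        # weights K[l, m, n] for one filter i
b = 0.1                          # bias weight b_i for this filter
g = np.tanh                      # activation function g()

def conv_activation(j, k):
    # Z^i_{j,k} = g(b_i + sum over l, m, n of K[l,m,n] * V[l, j+m, k+n])
    s = b
    for l in range(L):
        for m in range(M):
            for n in range(N):
                s += K[l, m, n] * V[l, j + m, k + n]
    return g(s)

print(conv_activation(0, 0))
print(L * M * N + 1)  # free parameters for this filter: 13
```

Note that the parameter count is independent of the spatial size H × W, because the same weights are shared across every window position.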