COSC2673/COSC2793 | Machine Learning
Tutorial | Week 04
Presentations
If you are scheduled for a presentation in the lab this week, please read the Canvas announcement regarding oral presentations and submit your recorded presentation before the due date.
If you are not presenting, please be polite and respectful to those giving their talks today. The presentations are not just for assessment; they are an excellent opportunity to learn about the wider field of machine learning.
Tutorial Questions
1. Last week we learnt about linear regression. Contrast this with logistic regression: what are the differences in the problem solved, in the model adopted, and in other characteristics? (A sketch of the two model forms is given after this list.)
2. Write out the logistic regression loss function. Explain what each term means and the intuition behind it, focusing on what happens when the value of the loss function is 0. How does it differ from the linear regression loss?
3. Write out the regularised linear regression loss function (ridge regression from lectures). Intuitively, what does the regularisation term do to the complexity of the trained model? (A small experiment you can run to explore this is sketched after this list.)
4. Is regularisation (in regression) necessary? Under what circumstances would you use regularisation, and when would you avoid it?
5. Derive the gradient descent update rule for univariate logistic regression. (A numerical gradient check that you can use to verify your derivation is sketched after this list.)
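As a starting point for Question 1, here is a sketch of the two model forms for a single input variable, assuming the $\theta$ parameter notation (your lectures may use different symbols):

$h_\theta(x) = \theta_0 + \theta_1 x$  (linear regression)

$h_\theta(x) = \sigma(\theta_0 + \theta_1 x)$, where $\sigma(z) = \frac{1}{1 + e^{-z}}$  (logistic regression)

The linear model outputs an unbounded real number, whereas the sigmoid squashes the same linear combination into the interval $(0, 1)$, which can be read as a class probability.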
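For Questions 3 and 4, the following is a minimal sketch (assuming a Python/scikit-learn environment; the data, polynomial degree, and alpha values are purely illustrative) of how you could observe the effect of the ridge penalty on the fitted coefficients:

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical noisy samples from a simple curve.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20).reshape(-1, 1)
y = np.sin(2 * np.pi * x).ravel() + rng.normal(scale=0.2, size=20)

# Fit a degree-9 polynomial with increasing amounts of regularisation
# and watch how the size of the fitted coefficients changes.
for alpha in [1e-6, 1e-2, 1.0]:
    model = make_pipeline(PolynomialFeatures(degree=9), Ridge(alpha=alpha))
    model.fit(x, y)
    coefs = model.named_steps["ridge"].coef_
    print(f"alpha = {alpha:g}: largest |coefficient| = {np.abs(coefs).max():.2f}")

Larger alpha values shrink the coefficients, which is one way of seeing the regularisation term as a constraint on model complexity.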
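For Question 5, a numerical gradient check is a handy way to verify your derived update rule. The sketch below (toy data and a plain Python/NumPy setup, all hypothetical) estimates the gradient of the average cross-entropy loss by central finite differences; the values it prints should match whatever expression you derive analytically:

import numpy as np

# Toy univariate data, purely for illustration.
x = np.array([0.5, 1.5, 2.0, 3.0, 4.5])
y = np.array([0, 0, 1, 1, 1])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(theta0, theta1):
    # Average cross-entropy loss for univariate logistic regression.
    p = sigmoid(theta0 + theta1 * x)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def numeric_grad(theta0, theta1, eps=1e-6):
    # Central finite differences: no analytic derivative needed here.
    d0 = (loss(theta0 + eps, theta1) - loss(theta0 - eps, theta1)) / (2 * eps)
    d1 = (loss(theta0, theta1 + eps) - loss(theta0, theta1 - eps)) / (2 * eps)
    return d0, d1

# Compare these numbers with your analytic gradient evaluated at the same point.
print(numeric_grad(0.0, 0.0))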