Analytical Regression
RMIT Classification: Trusted
Solving for Weights
The minimum of the loss function occurs at the point where the partial derivatives are zero. Thus $\mathbf{w}$ can be solved for analytically using the normal equations.
Solve for $\mathbf{w}$:

$\frac{\partial}{\partial w_j} J(w_0, w_1, \ldots, w_n) = 0 \quad \text{(for every } j\text{)}$

where the hypothesis and loss are

$h_{\mathbf{w}}(x) = w_0 + w_1 x_1 + \cdots + w_n x_n$

$J(w_0, w_1, \ldots, w_n) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_{\mathbf{w}}(x^{(i)}) - y^{(i)} \right)^2$

Simultaneously solve for $w_0, w_1, \ldots, w_n$
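As a sketch of the condition above (assuming NumPy and a made-up tiny dataset, with a leading column of ones standing in for the intercept weight $w_0$), the loss and its partial derivatives can be computed directly; at the minimum the partial derivatives all vanish:

```python
import numpy as np

# Hypothetical tiny dataset: m = 3 examples, one attribute plus a bias column.
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])   # first column is the constant 1 for w0
y = np.array([2.0, 3.0, 4.0])
w = np.array([1.0, 1.0])     # candidate weights (w0, w1)

def loss(w, X, y):
    """J(w) = 1/(2m) * sum_i (h_w(x^(i)) - y^(i))^2."""
    m = len(y)
    residuals = X @ w - y
    return residuals @ residuals / (2 * m)

def gradient(w, X, y):
    """Partial derivatives dJ/dw_j; all zero at the minimum."""
    m = len(y)
    return X.T @ (X @ w - y) / m

print(loss(w, X, y))      # 0.0 here, since w = (1, 1) fits y = 1 + x exactly
print(gradient(w, X, y))  # [0. 0.], the partials vanish at the minimum
```

Moving w away from the minimum makes the loss positive and the gradient non-zero, which is what gradient descent exploits; the analytical approach instead sets the gradient expression to zero and solves.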
COSC2673 | COSC2793 Week 2: Regression 33
Solving for Weights
Caveats:
This analytical approach effectively gives a matrix equation:

$\mathbf{y} = X\mathbf{w}$

• Vector of outputs from the training examples: $\mathbf{y} = [y^{(1)}, \ldots, y^{(m)}]^T$
• Matrix of inputs from the training examples: $X$

To solve requires inverting: $\mathbf{w} = X^{-1}\mathbf{y}$
• May not have a solution
• But can be approximated, through other techniques, not discussed in this course
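To illustrate the caveat (a sketch assuming NumPy and a made-up dataset): a design matrix with more examples than attributes is not square, so $X^{-1}$ does not exist and $X\mathbf{w} = \mathbf{y}$ generally has no exact solution, but a least-squares approximation is still available:

```python
import numpy as np

# Hypothetical non-square design matrix: 4 examples, 2 columns, so X has no
# ordinary inverse and X @ w = y has no exact solution for noisy y.
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([0.0, 1.0, 1.9, 3.1])

# Best-fit approximation via the Moore-Penrose pseudo-inverse.
w_pinv = np.linalg.pinv(X) @ y

# Equivalent, and numerically preferred: a dedicated least-squares solver.
w_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.allclose(w_pinv, w_lstsq))  # True: both give the same best-fit weights
```

Both routes minimise the residual $\|\mathbf{y} - X\mathbf{w}\|$ rather than solving the equation exactly, which is one of the "other techniques" alluded to above.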
Solving for Weights
Approximate with the analytical "equivalent" of the gradient descent loss function, a quadratic minimisation problem:

$\min_{\mathbf{w}} \|\mathbf{y} - X\mathbf{w}\|^2$

• Find the values of $\mathbf{w}$ which minimise the above

Has a unique solution, if the attributes are independent:

$\mathbf{w} = (X^T X)^{-1} X^T \mathbf{y}$
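The closed-form solution above translates directly into code. A sketch (assuming NumPy and randomly generated data; independent attribute columns make $X^T X$ invertible), cross-checked against NumPy's least-squares solver:

```python
import numpy as np

# Hypothetical data: 5 examples, a bias column plus 2 independent attributes.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(5), rng.normal(size=(5, 2))])
y = rng.normal(size=5)

# Normal equation: w = (X^T X)^(-1) X^T y.
# X^T X is invertible here because the attribute columns are independent.
w = np.linalg.inv(X.T @ X) @ X.T @ y

# Sanity check against NumPy's least-squares solver.
w_check, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(w, w_check))  # True
```

In practice `np.linalg.lstsq` (or solving the linear system rather than forming the explicit inverse) is preferred for numerical stability, but the normal-equation form makes the mathematics explicit.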