Supervised Learning

Outline: definitions; linear regression; gradient descent (batch and stochastic); the normal equations.

Supervised learning
Given a training set $\{(x^{(i)}, y^{(i)})\}$, with $x^{(i)} \in \mathcal{X}$ and $y^{(i)} \in \mathcal{Y}$, the goal is to find a good hypothesis $h : \mathcal{X} \to \mathcal{Y}$.

[Diagram: training set (e.g. house data) -> learning algorithm -> hypothesis $h$; a new input $x$ goes into $h$, which outputs a predicted price.]

The job of the learning algorithm is to produce $h$ from the training set. We then use $h$ on new data: given an input $x$, we call $h(x)$ the prediction. We are especially interested in inputs $x$ that are not in the training set.

If $y$ is discrete, the problem is called classification; if $y$ is continuous, it is called regression.
Example: house prices.

    Size (ft^2)    Price ($1000s)
    2104           400
    2500           900
    ...            ...

[Plot: asking price vs. square footage for the training set.]
How do we represent $h$?

$h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + \theta_3 x_3$,
with features $x_1$ = size, $x_2$ = # bedrooms, $x_3$ = lot size:

    Size (ft^2)    # Bedrooms    Lot size (ft^2)    Price ($1000s)
    2104           3             45,000             400
    2500           ...           30,000             900
More compactly,

$h_\theta(x) = \sum_{j=0}^{n} \theta_j x_j = \theta^T x$,

where the $\theta_j$ are the parameters and, by convention, $x_0 \equiv 1$.
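As a concrete sketch of this hypothesis (not from the notes), the prediction is just a dot product once the constant feature $x_0 = 1$ is prepended; the function name and the parameter values below are purely illustrative.

```python
import numpy as np

def h(theta, x):
    """Linear hypothesis h_theta(x) = sum_j theta_j * x_j, with x_0 = 1 prepended."""
    x = np.concatenate(([1.0], np.asarray(x, dtype=float)))  # add the x_0 = 1 term
    return float(theta @ x)

# Illustrative parameters and features (size, # bedrooms, lot size); not fitted values.
theta = np.array([50.0, 0.1, 20.0, 0.002])
print(h(theta, [2104, 3, 45000]))  # predicted price in $1000s
```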
Notation:
  $m$ = number of training examples
  $x$ = the inputs / features (e.g. $x^{(1)}_1 = 2104$, the size of the first house)
  $y$ = the output / target
  $(x^{(i)}, y^{(i)})$ = the $i$-th training example; $i$ runs from 1 to $m$
  $n$ = number of features
  $\theta$ is $(n+1)$-dimensional (it includes $\theta_0$)
[Plot: the training examples (price vs. ft^2) with the fitted line $h_\theta(x)$; $\theta_1$ is the slope and $\theta_0$ the intercept.]
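To make the notation concrete, here is a hedged numpy sketch of how a training set of $m$ examples with $n$ features might be stacked into arrays; the values are placeholders, not the lecture's data.

```python
import numpy as np

# Placeholder training set: m = 2 examples, n = 2 features (size, # bedrooms).
X_raw = np.array([[2104.0, 3.0],
                  [2500.0, 4.0]])            # shape (m, n)
y = np.array([400.0, 900.0])                 # targets in $1000s, shape (m,)

m, n = X_raw.shape
X = np.hstack([np.ones((m, 1)), X_raw])      # prepend x_0 = 1  ->  shape (m, n + 1)
theta = np.zeros(n + 1)                      # theta is (n + 1)-dimensional
```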
We want to choose $\theta$ so that $h_\theta(x) \approx y$ on the training examples. Define the least-squares cost function

$J(\theta) = \frac{1}{2} \sum_{i=1}^{m} \big(h_\theta(x^{(i)}) - y^{(i)}\big)^2$,

and choose $\theta$ to minimize $J(\theta)$.
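A minimal sketch of this cost function, assuming the design matrix X already includes the $x_0 = 1$ column as in the snippet above:

```python
import numpy as np

def cost(theta, X, y):
    """J(theta) = 1/2 * sum_i (h_theta(x^(i)) - y^(i))^2,
    where X is an (m, n+1) matrix whose first column is all ones."""
    residuals = X @ theta - y
    return 0.5 * float(residuals @ residuals)
```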
Gradient descent

Start with some initial $\theta$ (say $\theta = 0$, or chosen at random), and repeatedly reduce $J(\theta)$ by stepping along the negative gradient:

$\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta)$   (simultaneously for all $j$),

where $\alpha$ is the learning rate.

[Sketch: contours of $J(\theta)$; depending on where you start, the descent path can end at a local minimum rather than the global minimum. For the least-squares $J(\theta)$, which is convex, the only local minimum is the global one.]
For least squares this works out to the update

$\theta_j := \theta_j - \alpha \sum_{i=1}^{m} \big(h_\theta(x^{(i)}) - y^{(i)}\big)\, x_j^{(i)}$.
This is sometimes written in vector notation as $\theta := \theta - \alpha \nabla_\theta J(\theta)$.

The derivative comes from the chain rule. For a single training example $(x, y)$,

$\frac{\partial}{\partial \theta_j} \tfrac{1}{2}\big(h_\theta(x) - y\big)^2 = \big(h_\theta(x) - y\big) \frac{\partial}{\partial \theta_j} h_\theta(x) = \big(h_\theta(x) - y\big)\, x_j$,

since $h_\theta(x) = \theta_0 x_0 + \theta_1 x_1 + \cdots + \theta_n x_n$ and so $\frac{\partial}{\partial \theta_j} h_\theta(x) = x_j$.
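A hedged sketch of the batch update rule above; the learning rate and iteration count are arbitrary placeholders, not values from the notes (with unscaled features like square footage, $\alpha$ has to be small).

```python
import numpy as np

def batch_gradient_descent(X, y, alpha=1e-8, iters=1000):
    """theta_j := theta_j - alpha * sum_i (h_theta(x^(i)) - y^(i)) * x_j^(i),
    applied to all j at once via the vectorized gradient X^T (X theta - y)."""
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = X.T @ (X @ theta - y)   # gradient of J(theta) over the whole batch
        theta -= alpha * grad
    return theta
```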
Batch versus stochastic gradient descent

Batch gradient descent uses the entire training set to compute the gradient at every step; stochastic gradient descent updates $\theta$ using one example at a time. Mini-batch gradient descent sits in between: at each step, randomly select $b$ points, estimate the gradient from those $b$ points alone, and update $\theta$ (see the sketch below).

The mini-batch gradient estimate is noisier, but each step is much faster. Imagine a training set that contains 100 copies of the same point; this is not as ridiculous as it seems, since real data is full of near-copies. How do you choose $b$? Sadly: whatever works. It trades off the accuracy of the gradient estimate against the cost of each update.
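A sketch of that mini-batch variant; the batch size, learning rate, and seed are placeholders.

```python
import numpy as np

def minibatch_sgd(X, y, b=32, alpha=1e-8, iters=1000, seed=0):
    """Each step picks b random examples and uses them to estimate the gradient."""
    rng = np.random.default_rng(seed)
    m = X.shape[0]
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        idx = rng.choice(m, size=min(b, m), replace=False)  # randomly select b points
        Xb, yb = X[idx], y[idx]
        theta -= alpha * (Xb.T @ (Xb @ theta - yb))          # noisier, cheaper update
    return theta
```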
The normal equations

We want to find the minimum of $J : \mathbb{R}^{n+1} \to \mathbb{R}$,

$J(\theta) = \frac{1}{2} \sum_{i=1}^{m} \big(h_\theta(x^{(i)}) - y^{(i)}\big)^2$,

in closed form. Let $X$ be the design matrix whose $i$-th row is $(x^{(i)})^T$, and let $y$ be the vector of targets $y^{(i)}$. Then

$J(\theta) = \frac{1}{2} (X\theta - y)^T (X\theta - y)$.

Setting the gradient to zero,

$\nabla_\theta J(\theta) = X^T X \theta - X^T y = 0 \;\Rightarrow\; \theta = (X^T X)^{-1} X^T y$,

which is the optimal value.
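A sketch of solving the normal equations; solving the linear system $X^T X \theta = X^T y$ rather than forming the explicit inverse is a standard numerical choice, not something stated in the notes.

```python
import numpy as np

def normal_equation(X, y):
    """Solve X^T X theta = X^T y for theta, i.e. theta = (X^T X)^{-1} X^T y."""
    return np.linalg.solve(X.T @ X, X.T @ y)

# A more numerically robust alternative when X^T X is ill-conditioned:
# theta, *_ = np.linalg.lstsq(X, y, rcond=None)
```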