Lecture 13 & Workshop 3: Logistic Regression with Regularisation Flashcards
The model is built around what y = 1 denotes.
e.g. y = 1 means the tumor is malignant,
y = 0 means the tumor is benign.
=> the model is trying to predict whether the tumor is ?.
The model is built around what y = 1 denotes.
e.g. y = 1 means the tumor is malignant,
y = 0 means the tumor is benign.
=> the model is trying to predict whether the tumor is malignant.
Logistic regression is a classification model (not a special case of linear regression) where y = {??}
Logistic regression is a classification model (not a special case of linear regression) where y = {0, 1}
Model for logistic regression:
h(x) = g(??) using the sigmoid function g
h(x) can be interpreted as the estimated ? that ?? on input x.
Model for logistic regression:
h(x) = g(theta^T * x) using the sigmoid function g
h(x) can be interpreted as the estimated probability that y=1 on input x.
Sigmoid (logistic) function:
g(theta^T * x) = 1 / [1 + ??]
Sigmoid (logistic) function:
g(theta^T * x) = 1 / [1 + e^(-theta^T * x)]
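As a sanity check, the sigmoid and the hypothesis card above can be sketched in NumPy (the parameter and input values here are illustrative assumptions, not from the lecture):

```python
import numpy as np

def sigmoid(z):
    """Sigmoid (logistic) function g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def h(theta, x):
    """Hypothesis h(x) = g(theta^T * x): estimated probability that y = 1."""
    return sigmoid(theta @ x)

theta = np.array([0.0, 1.0])   # hypothetical parameters
x = np.array([1.0, 0.0])       # x[0] = 1 is the intercept term
print(sigmoid(0.0))            # 0.5, since 1 / (1 + e^0) = 1/2
print(h(theta, x))             # also 0.5 here, because theta^T * x = 0
```

Note that g maps any real theta^T * x into (0, 1), which is why h(x) can be read as a probability.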
to have a ? cost / a well-fitted h(x):
if y = 1 => we want h(x) to be close to ?
if y = 0 => we want h(x) to be close to ?.
to have a low cost / a well-fitted h(x):
if y = 1 => we want h(x) to be close to 1
if y = 0 => we want h(x) to be close to 0.
Logistic Cost function:
when y=1:
J(theta) = (-1/?) * Sum [??? ]
Logistic Cost function:
when y=1:
J(theta) = (-1/m) * Sum[ y * log(h(x)) ]   (with y = 1, each term reduces to log(h(x)))
m: number of observations
Logistic Cost function when y = 0:
J (theta) = (-1/m) * Sum [???]
Logistic Cost function when y = 0:
J(theta) = (-1/m) * Sum[ log(1 - h(x)) ]
m: number of observations
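Putting the two cards together, the full cost sums y * log(h(x)) and (1 - y) * log(1 - h(x)) over all m observations; the y = 1 case keeps the first term and the y = 0 case keeps the second. A minimal NumPy sketch (the toy data X, y and the parameter values are made up for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    """J(theta) = (-1/m) * Sum[ y*log(h(x)) + (1-y)*log(1-h(x)) ]."""
    m = len(y)
    hx = sigmoid(X @ theta)  # h(x) for every observation at once
    return (-1.0 / m) * np.sum(y * np.log(hx) + (1.0 - y) * np.log(1.0 - hx))

X = np.array([[1.0, 0.0],    # first column: intercept term
              [1.0, 1.0]])
y = np.array([0.0, 1.0])
print(cost(np.zeros(2), X, y))  # log(2) ~ 0.693, since h(x) = 0.5 everywhere
```

With theta = 0 every h(x) is 0.5, giving cost log(2); parameters that push h(x) toward the true labels drive the cost toward 0, matching the "close to 1 / close to 0" cards above.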