Week 2 Flashcards
What is linear regression with multiple variables?
It is also known as “multivariate linear regression”: linear regression that uses more than one input feature to predict the output.
Notation: n
Number of features.
Notation: x^(i)
Input (features) of the ith training example.
Notation: x_j^(i)
Value of feature j in the ith training example.
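The notation is easiest to see on a concrete array. Below is a minimal sketch, assuming a small made-up dataset stored as a NumPy matrix with one row per training example; the specific numbers are purely illustrative.

```python
import numpy as np

# Hypothetical training set with m = 3 examples and n = 2 features
# (made-up values): one row per training example, one column per feature.
X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 4.0]])

n = X.shape[1]     # n: number of features (here 2)
x_1 = X[0]         # x^(1): input features of the 1st training example
x_2_1 = X[0, 1]    # x_2^(1): value of feature 2 in the 1st training example
print(n, x_1, x_2_1)
```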
What is the hypothesis function?
h_theta(x) = theta^T x (theta transposed times x), i.e. theta_0*x_0 + theta_1*x_1 + ... + theta_n*x_n with x_0 = 1.
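As a sketch of the hypothesis, assuming theta and x are stored as NumPy vectors and that x includes the extra bias entry x_0 = 1, the prediction is just an inner product; the parameter and feature values below are made up.

```python
import numpy as np

def hypothesis(theta, x):
    """h_theta(x) = theta^T x, where x includes the bias entry x_0 = 1."""
    return theta @ x

# Made-up parameter vector and one training example's features.
theta = np.array([50.0, 0.1, 20.0])   # theta_0, theta_1, theta_2
x = np.array([1.0, 2104.0, 3.0])      # x_0 = 1, x_1, x_2
print(hypothesis(theta, x))           # 50 + 0.1*2104 + 20*3 = 320.4
```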
Is theta a vector?
Yes, an (n+1)-dimensional vector with entries theta_0 through theta_n.
What are parameters?
The entries of theta: theta_0, theta_1, ..., theta_n.
How do you update the thetas?
theta_j := theta_j - alpha * ∂J(theta)/∂theta_j, simultaneously for all j = 0, ..., n.
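For the squared-error cost used in linear regression, the partial derivative works out to (1/m) * sum over i of (h_theta(x^(i)) - y^(i)) * x_j^(i). A minimal batch gradient descent sketch, assuming NumPy arrays and a made-up toy dataset, might look like this:

```python
import numpy as np

def gradient_descent_step(theta, X, y, alpha):
    """One simultaneous update of all theta_j for the squared-error cost.

    X is m x (n+1) with a leading column of ones; y has length m.
    theta_j := theta_j - alpha * (1/m) * sum_i (h(x^(i)) - y^(i)) * x_j^(i)
    """
    m = len(y)
    errors = X @ theta - y            # h_theta(x^(i)) - y^(i) for every i
    gradient = (X.T @ errors) / m     # ∂J(theta)/∂theta_j for every j
    return theta - alpha * gradient   # update every theta_j at once

# Hypothetical tiny dataset: a column of ones, one feature, targets y.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 3.0, 4.0])
theta = np.zeros(2)
for _ in range(1000):
    theta = gradient_descent_step(theta, X, y, alpha=0.1)
print(theta)   # approaches [1, 1], since y = 1 + 1*x here
```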
What is feature scaling?
Feature scaling is normalizing the input features so that they are on a similar scale.
What does feature scaling do?
Feature scaling speeds up gradient descent by putting the input features into roughly the same range.
What does theta do on smaller ranges?
Theta descends quickly on smaller ranges.
What does theta do on larger ranges?
Theta descends slowly on larger ranges.
Into which ranges should you ideally put the input variables?
-1 <= x_i <= 1 or -0.5 <= x_i <= 0.5
What is the definition of feature scaling?
Feature scaling involves dividing the input values by the range of the input variable, resulting in a new range of just one.
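A minimal sketch of range-based scaling, assuming the features are held in a NumPy matrix with one column per feature and no column is constant; the sample values are made up.

```python
import numpy as np

def scale_by_range(X):
    """Divide each feature column by its range (max - min), assuming no
    column is constant, so each scaled feature spans a range of 1."""
    feature_range = X.max(axis=0) - X.min(axis=0)
    return X / feature_range

# Hypothetical feature matrix: house size and number of bedrooms.
X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 4.0]])
print(scale_by_range(X))
```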
What is the definition of mean normalization?
Mean normalization involves subtracting the average value of an input variable from each of its values, resulting in a new average of zero for that variable.
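Mean normalization is often combined with range-based scaling in a single step; a minimal sketch under the same assumptions as the scaling example above:

```python
import numpy as np

def mean_normalize(X):
    """Subtract each feature's mean and divide by its range, so every
    feature averages 0 and lies roughly within [-0.5, 0.5]."""
    mu = X.mean(axis=0)
    feature_range = X.max(axis=0) - X.min(axis=0)
    return (X - mu) / feature_range

# Same hypothetical feature matrix as in the scaling example.
X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 4.0]])
print(mean_normalize(X))   # each column now has mean 0
```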