Terminology Week 1 Flashcards
Describe linear regression.
A technique to find a relation between one or many input variables and an output variable.
A technique to find a line to best fit a set of points.
Describe a linear function.
y = w*x + b
y = predicted value (predicted label)
x = input variable (feature)
w = weight (gives the slope; a weight vector when there are multiple features)
b = bias (the y-intercept)
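The linear function above can be sketched as a one-line Python helper (the weight and bias values below are hypothetical, chosen only for illustration):

```python
def predict(x, w, b):
    """Return the predicted label y = w*x + b for feature x."""
    return w * x + b

# Example with slope w=2 and bias b=1: predict(3) = 2*3 + 1 = 7
print(predict(3, w=2, b=1))  # → 7
```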
Describe the notion of loss.
How well does our line predict an example.
How far away is the data point from the line.
Loss is a number indicating how bad the model’s prediction was on a single example.
We can describe the loss as the difference between the prediction of the value (a dot on the line) and the actual value (the example).
What is the L2 loss function also called?
Also called squared loss.
What is the squared loss function also called?
The L2 loss function.
Describe the L2 loss function. How is it calculated for a data set?
It is the square of the difference between the predicted value (prediction) and the true value (label).
We don’t care about the loss of a single value. We want to minimise the loss across our entire data set.
Sum of all squared differences (prediction - label).
Often divided by the number of examples in the data set = average squared loss = mean squared error = MSE
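The calculation above can be written out directly: square each (prediction - label) difference, sum, and divide by the number of examples (the data below is made up for illustration):

```python
def mse(predictions, labels):
    """Mean squared error: average of squared (prediction - label) differences."""
    n = len(labels)
    return sum((p - y) ** 2 for p, y in zip(predictions, labels)) / n

# Differences are 0 and -2, squared: 0 and 4, average: 2.0
print(mse([1.0, 2.0], [1.0, 4.0]))  # → 2.0
```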
What does infer mean?
To predict
What is linear regression (google definition)?
A type of regression model that outputs a continuous value from a linear combination of input features.
What is Inference (google definition)?
In machine learning, often refers to the process of making predictions by applying the trained model to unlabeled examples. In statistics, inference refers to the process of fitting the parameters of a distribution conditioned on some observed data.
What does training a model mean?
Training a model simply means learning (determining) good values for all the weights and the bias from labeled examples. In supervised learning, a machine learning algorithm builds a model by examining many examples and attempting to find a model that minimizes loss; this process is called empirical risk minimization.
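Training (learning good values for the weight and bias by minimizing loss) can be sketched with plain gradient descent on MSE. This is a minimal illustration, not the only training algorithm; the data, learning rate, and step count are hypothetical:

```python
def train(xs, ys, lr=0.05, steps=2000):
    """Learn w and b by gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of MSE with respect to w and b.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Labeled examples generated from y = 2x + 1; training should recover
# approximately w = 2 and b = 1.
w, b = train([0, 1, 2, 3], [1, 3, 5, 7])
print(round(w, 2), round(b, 2))
```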
Describe MSE.
Mean squared error = average squared loss.