Supervised Learning: Regression Flashcards

1
Q

Regression

A
  • finding & predicting the relationship between independent variables & a continuous output/dependent variable
  • e.g. predict price of car given set of features (age, brand)
  • supervised ML
2
Q

Regression - Tasks

A
  • Linear Regression
  • Neural Networks
3
Q

Linear Regression - Goal

A
  • learn a linear model that we can use to predict a new y given a previously unseen x with as little error as possible
  • parametric
4
Q

Linear Regression - Method

A
  • Linear equation: y’ = ß0 + ß1x1 + … + ßixi
  • estimate the coefficients (weights wi, bias b) so that the model
  • predicts the continuous variable (y’) based on the other variable(s) (xi) in the best way
  • = finding the straight line minimizing the distance between predicted & actual output
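As a sketch of this line-fitting idea: the closed-form least-squares formulas for one feature, in plain Python. Function names and example data are illustrative assumptions, not from the cards.

```python
# Minimal sketch: fit y' = b0 + b1*x by ordinary least squares
# (closed-form solution for a single feature).

def fit_simple_linear(xs, ys):
    """Return (b0, b1) minimizing the sum of squared errors."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope b1 = covariance(x, y) / variance(x)
    b1 = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
         / sum((x - mean_x) ** 2 for x in xs)
    # intercept b0 makes the line pass through the mean point
    b0 = mean_y - b1 * mean_x
    return b0, b1

# Points lying exactly on y = 2x + 1 recover those coefficients
b0, b1 = fit_simple_linear([0, 1, 2, 3], [1, 3, 5, 7])
```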
5
Q

Linear Regression - Steps

A
    1. Define cost / loss function: measures the inaccuracy of the model’s predictions
    2. Find parameters minimizing the loss function: make the model as accurate as possible -> Gradient Descent
6
Q

Linear Regression - Steps - Gradient Descent

A
  • method to find the minimum of the model’s (y’ = ß0 + ß1x1 + … + ßixi) loss function by an iterative process
  • Function f(ß0, ß1) = z <- focus on ß0 & ß1 (other variables of the cost function are given)
    1. Guess ß0 & ß1
    2. [∂z/∂ß0, ∂z/∂ß1] <- get the partial derivatives of the loss function with respect to each ß (= how much the total loss increases or decreases if ß0 or ß1 is increased by a very small amount)
    3. Adjust ß0 & ß1 accordingly: if partial derivative < 0 -> increase that ß; if > 0 -> decrease it
    4. Repeat 2 & 3 until the partial derivatives reach 0
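These steps can be sketched for the one-feature case; the learning rate, iteration count, and example data are arbitrary illustrative choices.

```python
# Gradient descent on the MSE loss of y' = b0 + b1*x.

def gradient_descent(xs, ys, lr=0.05, steps=2000):
    b0, b1 = 0.0, 0.0          # step 1: initial guess
    n = len(xs)
    for _ in range(steps):
        # step 2: partial derivatives of MSE w.r.t. b0 and b1
        d_b0 = (2 / n) * sum((b0 + b1 * x) - y for x, y in zip(xs, ys))
        d_b1 = (2 / n) * sum(((b0 + b1 * x) - y) * x for x, y in zip(xs, ys))
        # step 3: move each parameter against its gradient
        b0 -= lr * d_b0
        b1 -= lr * d_b1
    return b0, b1

# Converges toward the true line y = 2x + 1
b0, b1 = gradient_descent([0, 1, 2, 3], [1, 3, 5, 7])
```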
7
Q

Linear Regression - Steps - define cost function

A
  • a) Take each difference between a data point (yi) & the model prediction (ß1xi + ß0)
  • b) Square it (-> no negative numbers; penalizes large differences)
  • c) Take the average of the sum over all data points (= Mean Squared Error)
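The cost function described above, as a short sketch (example values are made up):

```python
# Mean Squared Error: average of squared differences between
# actual y and the model prediction b1*x + b0.

def mse(xs, ys, b0, b1):
    return sum((y - (b1 * x + b0)) ** 2 for x, y in zip(xs, ys)) / len(xs)

# A perfect fit gives zero loss; a worse line gives a larger loss
```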
8
Q

Linear Regression - Solutions to over- & underfitting

A
    1. Use more training data
    2. Use regularization = penalty added to the loss function when the model assigns too much weight to 1 feature or to too many features (cost + Lambda * (sum of ßi^2); Lambda = hyperparameter -> higher: more penalty)
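The regularized loss from point 2 can be sketched as follows (an L2 / ridge-style penalty; the convention of not penalizing the bias ß0 is an assumption, as are the example values):

```python
# Ridge-style regularization: data loss plus lam * sum(beta_i^2).

def ridge_loss(xs, ys, b0, b1, lam):
    data_loss = sum((y - (b1 * x + b0)) ** 2 for x, y in zip(xs, ys)) / len(xs)
    penalty = lam * (b1 ** 2)   # bias b0 conventionally not penalized
    return data_loss + penalty
```

A higher `lam` makes large coefficients more expensive, pushing the fit toward smaller weights.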
9
Q

Regression - Types

A

Simple = 1 independent variable <-> Multiple = more than 1 independent variable

10
Q

Linear Regression - Assumptions / Disadvantages

A
  • variables = quantitative (categorical -> binary); measured at a continuous level
  • no significant outliers
  • residuals (errors) of the best-fit regression line = normally distributed
  • relationship between dependent & independent variables = linear
  • all observations = independent
11
Q

Supervised Learning Goal

A
  • find the relation between input & output data based on already known answers (labeled data)
  • apply it to predict outcomes for new data
12
Q

Supervised application

A

image classification (dog or cat?), fraud detection, spam filtering

13
Q

Logistic Regression

A
  • predict a discrete class based on probability
  • put the linear regression f(x) into the sigmoid function P(Y=1) = 1 / (1 + e^-f(x)) -> result = probability between 0 & 1
  • threshold: class decision based on tolerance for false positives / negatives
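A sketch of this prediction step (coefficients, threshold default, and function names are illustrative assumptions):

```python
# Logistic regression prediction: push the linear score
# f(x) = b0 + b1*x through the sigmoid, then apply a threshold.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(x, b0, b1, threshold=0.5):
    p = sigmoid(b0 + b1 * x)    # P(Y=1), a probability between 0 and 1
    return 1 if p >= threshold else 0
```

Raising the threshold trades false positives for false negatives, and vice versa.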
14
Q

Logistic Regression - Log-odds ratio

A
  • log-odds = ln(p / (1 - p)) <- the linear part of the model predicts the log-odds of P(Y=1)
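The log-odds is the inverse of the sigmoid, which is why the linear score maps to a probability; a small sketch (function names are illustrative):

```python
# Log-odds (logit): ln(p / (1 - p)), the inverse of the sigmoid.
import math

def log_odds(p):
    return math.log(p / (1 - p))

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# sigmoid(log_odds(p)) recovers p, so a linear model of the
# log-odds is equivalent to a sigmoid model of the probability
```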
15
Q

Logistic Regression - Con

A
  • risk of overfitting with a large number of independent variables
  • needs a sample with evenly balanced classes