Lecture 11: Machine Learning for BDA Flashcards
Machine learning
=> Supervised learning, or
=> Unsupervised learning
Supervised Learning:
Training data set => Learning algorithm => hypothesis.
Data input => hypothesis => estimated output.
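A minimal sketch of this flow in Python, using hypothetical training data and scikit-learn's LinearRegression purely as an example learning algorithm (neither is prescribed by the lecture):

from sklearn.linear_model import LinearRegression

# Training data set: example inputs X_train with known outputs y_train (hypothetical values).
X_train = [[1.0], [2.0], [3.0], [4.0]]
y_train = [2.0, 4.0, 6.0, 8.0]

# Training data set => Learning algorithm => hypothesis (the fitted model).
hypothesis = LinearRegression().fit(X_train, y_train)

# Data input => hypothesis => estimated output.
print(hypothesis.predict([[5.0]]))  # approximately [10.]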
(Lecture 11 - slide 27)
?? is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function. The idea is to take repeated steps in the opposite direction of the gradient of the function at the current point, because this is the direction of steepest descent.
Gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function. The idea is to take repeated steps in the opposite direction of the gradient of the function at the current point, because this is the direction of steepest descent.
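A minimal sketch of gradient descent in Python, assuming the hypothetical function f(x) = (x - 3)^2 with gradient f'(x) = 2(x - 3); the step size and iteration count are illustrative choices, not values from the lecture:

def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    # Repeatedly step in the opposite direction of the gradient at the current point.
    x = x0
    for _ in range(steps):
        x = x - learning_rate * grad(x)
    return x

# f(x) = (x - 3)^2 has gradient 2 * (x - 3) and a local (here global) minimum at x = 3.
minimum = gradient_descent(grad=lambda x: 2 * (x - 3), x0=0.0)
print(minimum)  # approaches 3.0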
(Search online for images of gradient descent for visual intuition.)