Machine Learning Flashcards
Which method always converges to the same global optimum?
Logistic regression
Which method is not iterative?
Naïve Bayes
Which of the following is a feature selection technique?
Stepwise regression
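To make the idea concrete, here is a minimal sketch of forward stepwise selection: greedily add the feature that most reduces the residual sum of squares of a least-squares fit. The toy data and the RSS criterion are assumptions for illustration, not a prescribed implementation.

```python
import numpy as np

def forward_stepwise(X, y, max_features=2):
    """Greedy forward selection: at each step, add the feature that
    most reduces the residual sum of squares of a least-squares fit."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(max_features):
        best_rss, best_j = None, None
        for j in remaining:
            cols = selected + [j]
            beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            rss = np.sum((y - X[:, cols] @ beta) ** 2)
            if best_rss is None or rss < best_rss:
                best_rss, best_j = rss, j
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# Hypothetical toy data: y depends only on columns 0 and 2
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 3 * X[:, 0] - 2 * X[:, 2] + 0.1 * rng.normal(size=100)
print(forward_stepwise(X, y))
```

Backward elimination works the same way in reverse: start with all features and greedily drop the least useful one.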
Which method uses a gradient descent algorithm to find the optimum?
Logistic regression
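A minimal sketch of that fit, assuming batch gradient descent on the log-loss (the loss is convex, which is also why logistic regression reaches the same global optimum every run); the toy data is made up:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Batch gradient descent on the convex log-loss of logistic regression."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        # Gradient of the mean log-loss with respect to w
        grad = X.T @ (sigmoid(X @ w) - y) / len(y)
        w -= lr * grad
    return w

# Hypothetical 1-D toy problem; first column is the intercept
X = np.array([[1.0, x] for x in [-2, -1, -0.5, 0.5, 1, 2]])
y = np.array([0, 0, 0, 1, 1, 1])
w = fit_logistic(X, y)
preds = (sigmoid(X @ w) >= 0.5).astype(int)
print(preds)
```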
Which method is most prone to overfitting?
Neural network
Which method is used in deep learning?
Neural network
Which model uses the following formula:
P(C | X) = P(X | C) P(C) / Σ_c P(X | c) P(c)?
Naïve Bayes (the formula is Bayes' rule)
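The formula is Bayes' rule; a tiny numeric sketch on a spam-filtering toy case (all probabilities hypothetical):

```python
# P(spam | word) = P(word | spam) P(spam) / sum_c P(word | c) P(c)
p_spam, p_ham = 0.4, 0.6                    # hypothetical priors
p_word_given_spam, p_word_given_ham = 0.7, 0.1  # hypothetical likelihoods

# Denominator: total probability of seeing the word at all
evidence = p_word_given_spam * p_spam + p_word_given_ham * p_ham
posterior_spam = p_word_given_spam * p_spam / evidence
print(round(posterior_spam, 3))
```

Naïve Bayes applies this rule per class, with the "naïve" assumption that features are conditionally independent so the likelihood factorizes into a product of per-feature terms.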
You have a dataset with 10,000 samples and 20,000 features. Which method, which separates the classes with hyperplanes, is best suited to classify new data?
SVM
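A minimal sketch of a linear SVM fit by full-batch sub-gradient descent on the hinge loss (toy data and hyperparameters are assumptions; a production fit would use a library solver):

```python
import numpy as np

def linear_svm(X, y, lam=0.01, lr=0.1, n_iter=500):
    """Sub-gradient descent on the regularized hinge loss.
    Labels y must be +1/-1; the learned w defines the separating hyperplane."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        viol = y * (X @ w) < 1              # points inside the margin
        grad = lam * w
        if viol.any():
            grad -= (y[viol, None] * X[viol]).mean(axis=0)
        w -= lr * grad
    return w

# Hypothetical separable toy data; bias folded in as a constant first feature
X = np.array([[1.0, -2.0], [1.0, -1.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([-1, -1, 1, 1])
w = linear_svm(X, y)
print(np.sign(X @ w))
```

With far more features than samples (as in the question), the data is almost always linearly separable, and the SVM's margin maximization helps it generalize in that regime.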
Why, in essence, do we try to estimate posterior probabilities?
To minimize the error rate
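The reasoning behind that answer in miniature (the posterior values are hypothetical): at any input x, the chance of misclassifying is 1 − P(chosen class | x), so picking the class with the largest posterior makes that error as small as possible.

```python
# Hypothetical posteriors at some fixed input x
posteriors = {"spam": 0.8, "ham": 0.2}

# Bayes decision rule: choose the class with the largest posterior
choice = max(posteriors, key=posteriors.get)
error_if_chosen = 1 - posteriors[choice]
print(choice, error_if_chosen)
```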
Which of the following is not an issue for decision trees?
Computationally intensive