FA3 + Logistic Regression to Gradient Boosting Flashcards
We can visualize the tree using the export_graph function from the tree module.
Group of answer choices
True
False
False
In the decision tree, the region can be found by traversing the tree from the root and going left or right.
Group of answer choices
True
False
True
Decision tree is a model that learns a hierarchy of if/else questions, leading to a decision.
Group of answer choices
True
False
True
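A minimal sketch of the idea on these cards, using scikit-learn's iris dataset: the fitted tree is a hierarchy of if/else questions on feature thresholds, and a prediction traverses it from the root to a leaf.

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

# A decision tree learns a hierarchy of if/else questions: each internal
# node asks "is feature f <= threshold t?" and a sample goes left or
# right until it reaches a leaf, which gives the decision.
X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
prediction = tree.predict(X[:1])
```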
The .dot file format is a _____ file format for storing graphs.
TEXT
In the decision tree, the ______ represents the whole dataset.
Group of answer choices
Terminal Nodes
Edges
Root
Conditions
Root
The .dot file format is an image file format for storing graphs.
Group of answer choices
True
False
False
Decision trees in scikit-learn are implemented in the ________ and DecisionTreeClassifier classes.
Group of answer choices
DecisionRegressorTree
TreeDecisionRegressor
RegressorDecisionTree
DecisionTreeRegressor
DecisionTreeRegressor
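A short sketch showing both classes from the card above on a tiny made-up dataset (the data here is illustrative only):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

X = np.array([[0.0], [1.0], [2.0], [3.0]])

# DecisionTreeClassifier predicts discrete class labels...
clf = DecisionTreeClassifier(random_state=0).fit(X, np.array([0, 0, 1, 1]))
# ...while DecisionTreeRegressor predicts continuous values
reg = DecisionTreeRegressor(random_state=0).fit(X, np.array([0.0, 0.1, 0.9, 1.0]))
```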
Which is not true about Random Forest?
Group of answer choices
Not in the options
Less memory usage.
Less burden of parameter tuning.
As many trees are created, detailed analysis is difficult.
Poor performance for large and sparse data.
Less memory usage.
To build a random forest model, you need to decide on the __________ to build.
Group of answer choices
Depth of the tree
Height of the tree
Number of trees
Root
Node of the tree
Number of trees
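A sketch of the card above: the number of trees to build is set with the `n_estimators` parameter (the `make_moons` toy dataset here is just for illustration).

```python
from sklearn.datasets import make_moons
from sklearn.ensemble import RandomForestClassifier

X, y = make_moons(n_samples=100, noise=0.25, random_state=3)
# n_estimators is the number of trees to build; each tree is trained on
# a bootstrap sample of the data with random candidate features per split
forest = RandomForestClassifier(n_estimators=10, random_state=2).fit(X, y)
```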
The _______ are methods that combine multiple machine learning models to create more powerful models.
ENSEMBLES
In the decision tree, the terminal nodes represent the whole dataset.
Group of answer choices
True
False
False
In the decision tree, the if/else questions are called qualifiers.
Group of answer choices
True
False
False
Which is not true about Random Forest?
Group of answer choices
Reduces underfitting by averaging trees that predict well.
Reduces overfitting by averaging trees that predict well.
Selects candidate features at random when splitting nodes.
Randomly selects some of the data when creating a tree.
Reduces underfitting by averaging trees that predict well.
What are the parameters for Gradient Boosting?
a. n_estimators, learning_rate
b. n_estimators, max_features
c. n_estimators, learning_rate, max_depth
d. n_estimators, max_features, max_depth
c
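The three parameters from answer (c), sketched on scikit-learn's breast cancer dataset:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier

X, y = load_breast_cancer(return_X_y=True)
# The three key knobs: n_estimators (number of trees), learning_rate
# (how strongly each tree corrects its predecessors), and max_depth
# (complexity of each individual tree; stumps of depth 1 often suffice)
gbrt = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1,
                                  max_depth=1, random_state=0).fit(X, y)
```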
Gradient boosting is used when you need more performance than a random forest provides.
Group of answer choices
True
False
True
In the decision tree, the if/else questions are called ______.
Group of answer choices
Qualifiers
Condition
Tests
Nodes
Tests
Decision trees in scikit-learn are implemented in the DecisionTreeRegressor and _______ classes.
Group of answer choices
DecisionClassifier
TreeDecisionClassifier
DecisionTreeClassifier
DecisionClassifierTree
DecisionTreeClassifier
We can visualize the tree using the ______ function from the tree module.
export_graphviz
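A sketch of the `export_graphviz` card: with `out_file=None`, the function returns the tree as a string in the text-based .dot graph format.

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_graphviz

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
# export_graphviz produces the tree in the .dot format, which is a
# plain-text (not image) format for storing graphs
dot_text = export_graphviz(tree, out_file=None)
```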
Two most common linear classification algorithms:
Logistic Regression
Linear Support Vector Machines
Logistic Regression is implemented in ______
linear_model.LogisticRegression
Linear Support Vector Machines (Linear SVMs) are implemented in ______
svm.LinearSVC
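The two classes named on these cards, sketched on a toy two-blob dataset (the dataset choice here is illustrative):

```python
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC

# The two most common linear classification algorithms, fit side by side
X, y = make_blobs(centers=2, random_state=0)
logreg = LogisticRegression().fit(X, y)
linear_svm = LinearSVC().fit(X, y)
```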
SVC stands for?
support vector classifier
______ is a classification algorithm and not a regression algorithm, and it should not be confused with LinearRegression
LogisticRegression
The trade-off parameter that determines the strength of the regularization is called _____
C
Higher values of C correspond to _____
LESS REGULARIZATION
When you use a high value of the parameter C, LogisticRegression and LinearSVC will _______
try to fit the training set as best as possible
With low values of the parameter C, the models put more emphasis on _______
finding a coefficient vector (w) that is close to zero
Using low values of C will cause the algorithms to try to adjust to the _____ of data points
“majority”
using a higher value of C stresses the importance that each ______ be classified correctly
individual data point
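A sketch of the effect of C described on the cards above, assuming the breast cancer dataset as an example; low C pushes coefficients toward zero, high C lets them grow to fit individual points.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression

X, y = load_breast_cancer(return_X_y=True)
# Low C -> strong regularization: coefficient vector w pushed toward
# zero, the model adjusts to the "majority" of data points.
# High C -> weak regularization: fit the training set as well as possible.
low_c = LogisticRegression(C=0.01, max_iter=10_000).fit(X, y)
high_c = LogisticRegression(C=100, max_iter=10_000).fit(X, y)
```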
_______ are a family of classifiers that are quite similar to the linear models
Naive Bayes classifiers
In Naive Bayes, the _____ is faster than for a linear classifier
Training speed
In Naive Bayes, _____ performance is slightly lower
Generalization
The reason that Naive Bayes models are so efficient is that they______ and collect simple per-class statistics from each feature
learn parameters by looking at each feature individually
The reason that Naive Bayes models are so efficient is that they learn parameters by looking at each feature individually and _______
collect simple per-class statistics from each feature
3 Kinds of Naive Bayes Classifier in Scikit-learn:
GaussianNB
BernoulliNB
MultinomialNB
GaussianNB -> ____ data
Continuous
BernoulliNB -> ____ data, ___ data
Binary data, Text data
MultinomialNB -> ____ data, ___ data
Integer count data, text data
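The three classifiers and their expected data types, sketched on tiny made-up datasets (the data here is purely illustrative):

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB, BernoulliNB, MultinomialNB

y = np.array([0, 0, 1, 1])
# GaussianNB: continuous-valued features
gnb = GaussianNB().fit(
    np.array([[1.2, 0.4], [0.9, 0.3], [3.1, 2.2], [2.8, 2.4]]), y)
# BernoulliNB: binary features (e.g. word present/absent in text)
bnb = BernoulliNB().fit(np.array([[1, 0], [0, 1], [1, 1], [0, 0]]), y)
# MultinomialNB: integer count features (e.g. word counts in text)
mnb = MultinomialNB().fit(np.array([[2, 0], [1, 3], [0, 5], [4, 1]]), y)
```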
In Naive Bayes, the alpha parameter controls _____
model complexity
In Naive Bayes, the statistics are _____ by adding as many virtual positive data points as alpha
Smoothed
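A sketch of alpha-smoothing on a small made-up count dataset: alpha adds virtual observations of every feature to the per-class counts, smoothing the statistics.

```python
import numpy as np
from sklearn.naive_bayes import MultinomialNB

X = np.array([[2, 1, 0], [1, 2, 0], [0, 1, 3], [0, 2, 2]])
y = np.array([0, 0, 1, 1])
# alpha adds that many "virtual" positive observations of every feature
# to the per-class statistics; a larger alpha gives smoother
# probabilities and hence a simpler model
nb = MultinomialNB(alpha=1.0).fit(X, y)
```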