Chapters 9-11 Use Keras Models With Scikit-Learn For General Machine Learning Flashcards

Classification Problems

1
Q

The Keras library provides a convenient wrapper for deep learning models to be used as classification or regression estimators in scikit-learn. True/False P 67

A

True

The role of the KerasClassifier/KerasRegressor is to act as an adapter that makes the Keras model behave like an MLPClassifier/MLPRegressor object from scikit-learn.
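For regression problems the wrapping is analogous. A minimal sketch (not from the book), assuming SciKeras is installed and using scikit-learn's load_diabetes dataset (10 input features) purely for illustration:

from keras.models import Sequential
from keras.layers import Dense
from scikeras.wrappers import KerasRegressor
from sklearn.model_selection import cross_val_score, KFold
from sklearn.datasets import load_diabetes

# model-building function; SciKeras compiles it from the loss/optimizer arguments
def build_regressor():
    model = Sequential()
    model.add(Dense(16, input_dim=10, activation="relu"))
    model.add(Dense(1))  # single linear output for regression
    return model

X, y = load_diabetes(return_X_y=True)  # 10 input features (illustrative dataset)
regressor = KerasRegressor(model=build_regressor, epochs=50, batch_size=16, verbose=0,
                           loss="mean_squared_error", optimizer="adam")
kfold = KFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(regressor, X, y, cv=kfold, scoring="neg_mean_squared_error")
print(scores.mean())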

2
Q

How are the KerasClassifier and KerasRegressor used in code? P 68

A
# pip install scikeras
import numpy
from keras.models import Sequential
from keras.layers import Dense
from scikeras.wrappers import KerasClassifier, KerasRegressor
from sklearn.model_selection import StratifiedKFold, cross_val_score

# function that builds the Keras model (SciKeras compiles it from the wrapper arguments)
def create_model():
    model = Sequential()
    model.add(Dense(12, input_dim=8, kernel_initializer="uniform", activation="relu"))
    model.add(Dense(8, kernel_initializer="uniform", activation="relu"))
    model.add(Dense(1, kernel_initializer="uniform", activation="sigmoid"))
    return model

# fix random seed for reproducibility
seed = 7
numpy.random.seed(seed)
# load pima indians dataset
dataset = numpy.loadtxt("pima-indians-diabetes.csv", delimiter=",")
# split into input (X) and output (Y) variables
X = dataset[:, 0:8]
Y = dataset[:, 8]
# wrap the model-building function as a scikit-learn classifier
model = KerasClassifier(model=create_model, epochs=150, batch_size=10, verbose=0,
                        loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
# evaluate using 10-fold cross validation
kfold = StratifiedKFold(n_splits=10, shuffle=True, random_state=seed)
results = cross_val_score(model, X, Y, cv=kfold)
print(results.mean())
3
Q

What is the benefit of using the KerasClassifier/KerasRegressor wrappers? P 69

A

Once the Keras model is wrapped, estimating model accuracy is greatly streamlined compared to manually enumerating the cross-validation folds (P 65), where we have to write a for loop and fit and evaluate the model on each train/test fold ourselves.
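For comparison, a minimal sketch of the manual approach that P 65 refers to, reconstructed here under the assumption of the same Pima Indians diabetes setup: every fold has to be split, fitted and evaluated by hand.

import numpy
from keras.models import Sequential
from keras.layers import Dense
from sklearn.model_selection import StratifiedKFold

dataset = numpy.loadtxt("pima-indians-diabetes.csv", delimiter=",")
X, Y = dataset[:, 0:8], dataset[:, 8]

kfold = StratifiedKFold(n_splits=10, shuffle=True, random_state=7)
scores = []
for train_idx, test_idx in kfold.split(X, Y):
    # build, compile, fit and evaluate a fresh model for every fold
    model = Sequential()
    model.add(Dense(12, input_dim=8, activation="relu"))
    model.add(Dense(8, activation="relu"))
    model.add(Dense(1, activation="sigmoid"))
    model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
    model.fit(X[train_idx], Y[train_idx], epochs=150, batch_size=10, verbose=0)
    _, accuracy = model.evaluate(X[test_idx], Y[test_idx], verbose=0)
    scores.append(accuracy)
print(numpy.mean(scores))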

4
Q

How do we use GridSearchCV for NNs? P 69

A

1- Define the NN model in a function (SciKeras compiles it from the loss/optimizer arguments)
2- Wrap it with the SciKeras KerasClassifier
3- Load the dataset and use GridSearchCV as we normally would

import numpy
from keras.models import Sequential
from keras.layers import Dense
from scikeras.wrappers import KerasClassifier
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV

# load dataset
dataset = load_breast_cancer()
X = dataset.data
Y = dataset.target

# 1- define the NN model in a function
def create_model():
    model = Sequential()
    model.add(Dense(12, input_dim=30, activation="relu"))
    model.add(Dense(8, activation="relu"))
    model.add(Dense(1, activation="sigmoid"))
    return model

# 2- wrap it with SciKeras
model = KerasClassifier(model=create_model, verbose=0, random_state=0, loss="binary_crossentropy")

# 3- grid search epochs, batch size and optimizer
optimizers = ["rmsprop", "adam"]
epochs = [50, 100]
batches = [5]
param_grid = dict(optimizer=optimizers, epochs=epochs, batch_size=batches)
grid = GridSearchCV(estimator=model, param_grid=param_grid, cv=10, scoring="roc_auc")
grid_result = grid.fit(X, Y)

# summarize results
print("Best: %f using %s" % (grid_result.best_score_, grid_result.best_params_))
print("Results: params {}\nmean test scores {}".format(
    grid_result.cv_results_["params"], grid_result.cv_results_["mean_test_score"]))
5
Q

How can we use Scikeras to create a model for multiclass datasets? P 76

A
# Multiclass Classification with the Iris Flowers Dataset
import numpy
from keras.models import Sequential
from keras.layers import Dense
from keras.utils import to_categorical
from scikeras.wrappers import KerasClassifier
from sklearn.model_selection import cross_val_score, KFold
from sklearn.preprocessing import LabelEncoder
from sklearn.datasets import load_iris

# load dataset
dataset = load_iris()
X = dataset.data
Y = dataset.target

# encode class values as integers
encoder = LabelEncoder()
encoder.fit(Y)
encoded_Y = encoder.transform(Y)
# convert integers to dummy variables (i.e. one hot encoded)
dummy_y = to_categorical(encoded_Y)

# define baseline model
def baseline_model():
    model = Sequential()
    model.add(Dense(4, input_dim=4, activation="relu"))
    model.add(Dense(4, activation="relu"))
    model.add(Dense(3, activation="softmax"))  # one output neuron per class
    return model

estimator = KerasClassifier(model=baseline_model, epochs=200, batch_size=10, verbose=0,
                            random_state=0, loss="categorical_crossentropy", optimizer="adam")
kfold = KFold(n_splits=10, shuffle=False)
results = cross_val_score(estimator, X, dummy_y, cv=kfold, scoring="accuracy")
print(results.mean())
6
Q

How can we use StandardScaler with the NN? P 82

A

import numpy
from keras.models import Sequential
from keras.layers import Dense
from scikeras.wrappers import KerasClassifier
from sklearn.model_selection import cross_val_score, KFold
from sklearn.pipeline import Pipeline
from sklearn.datasets import load_breast_cancer
from sklearn.preprocessing import StandardScaler

# load dataset
dataset = load_breast_cancer()
X = dataset.data
Y = dataset.target

# define baseline model
def baseline_model():
    model = Sequential()
    model.add(Dense(4, input_dim=30, activation="relu"))
    model.add(Dense(4, activation="relu"))
    model.add(Dense(1, activation="sigmoid"))
    return model

# put the scaler and the wrapped Keras model in one Pipeline so scaling is
# fit on each training fold only (no leakage into the test fold)
estimator = Pipeline(steps=[("scaler", StandardScaler()),
                            ("model", KerasClassifier(model=baseline_model, epochs=100, batch_size=10,
                                                      verbose=0, random_state=0,
                                                      loss="binary_crossentropy", optimizer="adam"))])
kfold = KFold(n_splits=5, shuffle=True, random_state=0)
results = cross_val_score(estimator, X, Y, cv=kfold, scoring="average_precision")
print(results.mean())
7
Q

There are many things to tune on a neural network, such as the weight initialization, activation functions, optimization procedure and so on. One aspect that may have an outsized effect is … P 83

A

the structure of the network itself, called the network topology

8
Q

How can we improve the network’s performance by making it smaller? P 83

A

In the first hidden layer we can decrease the number of neurons, for example halving it relative to the number of inputs:

model.add(Dense(30, input_dim=60, kernel_initializer="normal", activation="relu"))

This puts pressure on the network during training to pick out the most important structure in the input data to model.
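As a sketch only (the 60 input features and the binary sigmoid output are assumed from the book's sonar example), the full "smaller" model function could look like this:

from keras.models import Sequential
from keras.layers import Dense

def create_smaller():
    model = Sequential()
    # 30 neurons for 60 inputs: the layer must compress the input representation
    model.add(Dense(30, input_dim=60, kernel_initializer="normal", activation="relu"))
    model.add(Dense(1, kernel_initializer="normal", activation="sigmoid"))
    return model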

9
Q

How does adding more hidden layers help the neural network model performance? P 84

A

In the example below, the idea is that the network is given the opportunity to model all input variables before being bottlenecked and forced to halve the representational capacity.

model.add(Dense(60, input_dim=60, kernel_initializer="normal", activation="relu"))
model.add(Dense(30, kernel_initializer="normal", activation="relu"))
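Again as a sketch, assuming the same 60-input binary setup as above, the complete "larger" (deeper) model function would be:

from keras.models import Sequential
from keras.layers import Dense

def create_larger():
    model = Sequential()
    # model all 60 inputs first ...
    model.add(Dense(60, input_dim=60, kernel_initializer="normal", activation="relu"))
    # ... then bottleneck to half the representational capacity
    model.add(Dense(30, kernel_initializer="normal", activation="relu"))
    model.add(Dense(1, kernel_initializer="normal", activation="sigmoid"))
    return model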