Chapter 12 Project: Regression Of Boston House Prices Flashcards
How can we create a KerasRegressor model with a StandardScaler? P 87
import numpy
import pandas
from keras.models import Sequential
from keras.layers import Dense
from scikeras.wrappers import KerasRegressor
from sklearn.model_selection import cross_val_score, KFold
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import Pipeline
from sklearn.datasets import fetch_california_housing

dataset = fetch_california_housing(as_frame=True)
X = dataset.data
Y = dataset.target

# Base model: one hidden layer with 8 neurons (one per input feature)
def baseline_model():
    model = Sequential()
    model.add(Dense(8, input_dim=8, activation='relu'))
    model.add(Dense(1))
    return model

# Standardize inputs inside the pipeline so scaling is fit only on training folds
estimator = Pipeline(steps=[
    ('scaler', StandardScaler()),
    ('model', KerasRegressor(model=baseline_model, epochs=10, batch_size=100,
                             verbose=0, random_state=0,
                             loss="mean_squared_error", optimizer="adam"))
])
kfold = KFold(n_splits=3)
results = cross_val_score(estimator, X, Y, cv=kfold, scoring='neg_mean_squared_error')
# MSE is negative when returned by cross_val_score (greater-is-better convention)
print("Baseline: %.2f (%.2f) MSE" % (results.mean(), results.std()))
One way to improve the performance of a neural network is to add more layers. This might allow the model to extract and recombine higher order features embedded in the data. True/False? P 92
True
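For example, a deeper variant of the baseline model could add a second, smaller hidden layer. This is a minimal sketch; the 6-neuron size of the extra layer is an assumption, chosen smaller than the first layer so the network is forced to recombine the first layer's features into higher-order ones.

```python
from keras.models import Sequential
from keras.layers import Dense

def larger_model():
    # Same input layer as the baseline (8 features)
    model = Sequential()
    model.add(Dense(8, input_dim=8, activation='relu'))
    # Extra hidden layer (6 neurons is an assumed size) to extract
    # and recombine higher-order features
    model.add(Dense(6, activation='relu'))
    model.add(Dense(1))
    return model
```

This function could replace baseline_model in the pipeline from P 87 without any other change.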
Another approach to increasing the representational capacity of the model is to create a wider network. True/False? P 93
True. For example, keeping a shallow network architecture and roughly doubling the number of neurons in the single hidden layer can help.
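A wider variant might look like the sketch below. The choice of 16 neurons is an assumption (double the baseline's 8); the network stays shallow, with a single hidden layer.

```python
from keras.models import Sequential
from keras.layers import Dense

def wider_model():
    model = Sequential()
    # Single hidden layer, but with double the neurons of the baseline
    # (16 is an assumed width for the 8 input features)
    model.add(Dense(16, input_dim=8, activation='relu'))
    model.add(Dense(1))
    return model
```

As with the deeper variant, this function can be dropped into the same StandardScaler + KerasRegressor pipeline and scored with cross_val_score to compare architectures.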
It would be hard to guess in advance that a wider network would outperform a deeper network on a given problem. This demonstrates the importance of empirical testing when developing neural network models. True/False? P 94
True