Chapter 13 Save Your Models For Later With Serialization Flashcards
Keras separates the concerns of saving your model architecture and saving your model weights. Model weights are saved to … format. This is a grid format that is ideal for storing multi-dimensional arrays of numbers. The model structure can be described and saved (and loaded) using two different formats: … and …. P 97
HDF5, JSON, YAML
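A minimal sketch of that separation of concerns, assuming the Keras 2 API used in the book (the layer sizes and file name are hypothetical; Keras 3 expects a .weights.h5 suffix for save_weights(), and to_yaml() was removed in newer versions):

# Architecture and weights are saved as two separate artifacts.
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(4, input_dim=8, activation="relu"))
model.add(Dense(1, activation="sigmoid"))

json_string = model.to_json()     # architecture only, as a JSON string
model.save_weights("weights.h5")  # weights only, stored in HDF5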
Keras provides the ability to describe any model using JSON format with a … function. This can be saved to file and later loaded via the … function that will create a new model from the JSON specification. P 98
to_json(), model_from_json()
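A hedged sketch of the JSON round trip described above (Keras 2 API; the small layer sizes are only for illustration):

from keras.models import Sequential, model_from_json
from keras.layers import Dense

model = Sequential()
model.add(Dense(4, input_dim=8, activation="relu"))
model.add(Dense(1, activation="sigmoid"))

architecture = model.to_json()           # JSON string describing the layers
rebuilt = model_from_json(architecture)  # new, untrained model with the same structure
rebuilt.summary()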
The weights are saved directly from the model using the … function and later loaded using the symmetrical … function. P 98
save_weights(), load_weights()
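A hedged sketch of the symmetrical weight calls (the file name and layer sizes are hypothetical). save_weights()/load_weights() move only the parameters, so the receiving model must already have a matching architecture:

from keras.models import Sequential
from keras.layers import Dense

def build():
    m = Sequential()
    m.add(Dense(4, input_dim=8, activation="relu"))
    m.add(Dense(1, activation="sigmoid"))
    return m

trained = build()
trained.save_weights("demo_weights.h5")  # parameters written to HDF5

fresh = build()                          # same architecture, new random weights
fresh.load_weights("demo_weights.h5")    # now mirrors the trained model's parameters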
What does the below code do? P 98
import numpy
from keras.models import Sequential, model_from_json
from keras.layers import Dense
from sklearn.datasets import load_breast_cancer

seed = 7
numpy.random.seed(seed)
import tensorflow
tensorflow.random.set_seed(seed)  # for reproducibility

dataset = load_breast_cancer(as_frame=True)
X = dataset.data
Y = dataset.target

model = Sequential()
model.add(Dense(12, input_dim=30, activation="relu"))
model.add(Dense(8, activation="relu"))
model.add(Dense(1, activation="sigmoid"))
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])

# Fit the model
model.fit(X, Y, epochs=150, batch_size=10, verbose=0)

# evaluate the model
scores = model.evaluate(X, Y, verbose=0)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

# serialize model to JSON
model_json = model.to_json()
with open("model.json", "w") as json_file:
    json_file.write(model_json)

# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model to disk")

# later...
# load json and create model
with open("model.json", "r") as json_file:
    loaded_model_json = json_file.read()
loaded_model = model_from_json(loaded_model_json)

# load weights into new model
loaded_model.load_weights("model.h5")
print("Loaded model from disk")

# evaluate loaded model on test data
loaded_model.compile(loss="binary_crossentropy", optimizer="rmsprop", metrics=["accuracy"])
score = loaded_model.evaluate(X, Y, verbose=0)
print("%s: %.2f%%" % (loaded_model.metrics_names[1], score[1]*100))
It saves the neural network architecture and its weights to two separate files, in JSON and HDF5 formats respectively, then loads them back, rebuilds the model, and uses it to evaluate the dataset.
Note the call to .compile() on the loaded model before using it to evaluate the dataset: a model rebuilt from JSON carries no loss, optimizer, or metrics, so it must be compiled again.
What code should we use to get reproducible results from neural networks? External
import tensorflow
tensorflow.random.set_seed(seed)
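A fuller hedged sketch: seed Python's random module, NumPy, and TensorFlow before building the model (the seed value 7 matches the example above; exact bit-for-bit repeatability can still vary across hardware and TensorFlow versions):

import random
import numpy
import tensorflow

seed = 7
random.seed(seed)                 # Python's built-in RNG
numpy.random.seed(seed)           # NumPy RNG used for weight shuffling etc.
tensorflow.random.set_seed(seed)  # TensorFlow's global RNG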