Epochs, Batches, Iterations Flashcards
When are epochs, batches, batch sizes and iterations appropriate to use?
When the dataset is too large to pass through the neural network all at once, so we need to break it down into smaller chunks and feed those chunks to the network one by one
Can you describe what an Epoch is?
When the entire dataset is passed forward and backward through the neural network exactly once. We can run multiple epochs to help the model generalize better
Batch and Batch Size?
A batch is one of the smaller chunks produced by dividing a larger dataset, and batching is the process of feeding those chunks into the neural network one at a time. The batch size is the TOTAL number of training examples in a single batch
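The batching described above can be sketched in a few lines of Python. This is a minimal illustration, not any particular library's API; the names `make_batches`, `dataset`, and `batch_size` are illustrative.

```python
def make_batches(dataset, batch_size):
    """Yield consecutive chunks of at most `batch_size` examples."""
    for start in range(0, len(dataset), batch_size):
        yield dataset[start:start + batch_size]

# Toy dataset of 10 training examples, batch size of 4.
dataset = list(range(10))
batches = list(make_batches(dataset, batch_size=4))
print([len(b) for b in batches])  # → [4, 4, 2]
```

Note the final batch can be smaller than the batch size when the dataset does not divide evenly.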
Is there a magic number of epochs that can be used to help our model generalize better?
No, because it is entirely dependent on the dataset. Datasets vary widely in size and content, so the number of passes a model needs to generalize well varies too
What other approaches can we take if our dataset is too large and an epoch will not work?
Batches and batch sizes: we can divide a large dataset into smaller batches and feed those batches into the neural network, where the batch size is the TOTAL number of training examples in a single batch
What is an iteration?
The number of batches needed to complete one epoch, so the number of iterations per epoch equals the number of batches
Can you describe the relationship between an epoch, batch, and an iteration?
Suppose you have a dataset of 34,000 training examples and you divide it into batches of 500. To complete 1 epoch would take 34,000 / 500 = 68 iterations