Last Updated on July 12, 2022

When we build and train a Keras deep learning model, the training data can be provided in several different ways. Presenting the data as a NumPy array or a TensorFlow tensor is a common one. Writing a Python generator function and letting the training loop read data from it is another way. Yet another way of providing data is to use a tf.data dataset.

In this tutorial, we will see how we can use a tf.data dataset for a Keras model. After finishing this tutorial, you will learn:

  • How to create and use tf.data dataset
  • The benefit of doing so compared to a generator function

Let’s get started.

A Gentle Introduction to tensorflow.data API
Photo by Monika MG. Some rights reserved.

Overview

This article is split into four sections; they are:

  • Training a Keras Model with NumPy Array and Generator Function
  • Creating a Dataset using tf.data
  • Creating a Dataset from Generator Function
  • Dataset with Prefetch

Training a Keras Model with NumPy Array and Generator Function

Before we see how the tf.data API works, let’s review how we usually train a Keras model.

First, we need a dataset. An example is the fashion MNIST dataset that comes with the Keras API. It has 60,000 training samples and 10,000 test samples of 28×28 pixels in grayscale, and the corresponding classification labels are encoded with integers 0 to 9.

The data comes as NumPy arrays. We can then build a Keras model for classification and provide the NumPy arrays as data to the model's fit() function.

The complete code is as follows:
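A minimal sketch of such a script follows. The layer sizes here are illustrative assumptions rather than the article's exact architecture, and the epoch count is reduced for a quick run (the article trains for 50 epochs):

```python
import tensorflow as tf

# Load the fashion MNIST data as NumPy arrays
(train_image, train_label), (test_image, test_label) = \
    tf.keras.datasets.fashion_mnist.load_data()

# A small classifier; the layer sizes are illustrative
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["sparse_categorical_accuracy"])

# Provide the NumPy arrays directly to fit(); use epochs=50 to
# reproduce the article's training run
history = model.fit(train_image, train_label,
                    batch_size=32, epochs=2,
                    validation_data=(test_image, test_label))
```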

Running this code will print the training progress for each epoch.

It also creates a plot of validation accuracy over the 50 epochs we trained our model.

The other way of training the same network is to provide the data from a Python generator function instead of a NumPy array. A generator function is one with a yield statement that emits data while the function runs in parallel with the data consumer. A generator for the fashion MNIST dataset can be created as follows:
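A sketch of such a generator, reconstructed from the description below (the function and argument names follow the calling convention stated in the text):

```python
import numpy as np

def batch_generator(image, label, batchsize):
    # Yield (image, label) batches indefinitely, wrapping around to the
    # start of the arrays once they are exhausted
    N = len(image)
    i = 0
    while True:
        yield image[i:i+batchsize], label[i:i+batchsize]
        i += batchsize
        if i >= N:
            i = 0  # restart from the beginning
```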

This function is supposed to be called with the syntax batch_generator(train_image, train_label, 32). It will scan the input arrays in batches indefinitely: once it reaches the end of an array, it restarts from the beginning.

Training a Keras model with a generator is similar, using the fit() function:
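A self-contained sketch of this pattern (random arrays stand in for the fashion MNIST data and the model is deliberately tiny; the key detail is that fit() receives the generator together with steps_per_epoch):

```python
import numpy as np
import tensorflow as tf

def batch_generator(image, label, batchsize):
    # Emit batches forever, wrapping around at the end of the arrays
    N, i = len(image), 0
    while True:
        yield image[i:i+batchsize], label[i:i+batchsize]
        i += batchsize
        if i >= N:
            i = 0

# Random stand-ins for the fashion MNIST arrays
train_image = np.random.rand(256, 28, 28).astype("float32")
train_label = np.random.randint(0, 10, size=256)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# steps_per_epoch marks the end of an epoch for the endless generator
history = model.fit(batch_generator(train_image, train_label, 32),
                    steps_per_epoch=len(train_image) // 32,
                    epochs=2, verbose=0)
```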

Instead of providing the data and labels separately, we just need to provide the generator, as it yields both. When data are presented as a NumPy array, we can tell how many samples there are from the length of the array, so Keras knows an epoch is complete when the entire dataset has been used once. Our generator function, however, emits batches indefinitely, so we need to tell Keras when an epoch ends, using the steps_per_epoch argument to the fit() function.

While in the code above we provided the validation data as NumPy arrays, we could also use a generator for it and specify the validation_steps argument.

The complete code using the generator function follows the same pattern, and its output is the same as in the previous example.

Creating a Dataset using tf.data

Given we have the fashion MNIST data loaded, we can convert it into a tf.data dataset, like the following:
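A sketch of the conversion (zero-filled arrays with the same shape and dtype stand in for the fashion MNIST data, so the snippet is self-contained):

```python
import numpy as np
import tensorflow as tf

# Stand-ins with the same shape and dtype as the fashion MNIST arrays
train_image = np.zeros((60000, 28, 28), dtype=np.uint8)
train_label = np.zeros(60000, dtype=np.uint8)

dataset = tf.data.Dataset.from_tensor_slices((train_image, train_label))
print(dataset.element_spec)
# (TensorSpec(shape=(28, 28), dtype=tf.uint8, name=None),
#  TensorSpec(shape=(), dtype=tf.uint8, name=None))
```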

This prints the dataset's element spec.

We can see that the data is a tuple (since we passed a tuple as the argument to the from_tensor_slices() function), where the first element is of shape (28, 28) and the second element is a scalar. Both elements are stored as 8-bit unsigned integers.

If we do not have the data as a tuple of two NumPy arrays when we create the dataset, we can also combine them later. The following creates the same dataset, but first builds separate datasets for the image data and the labels before combining them:
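A sketch of this approach, again with stand-in arrays of the same shape and dtype as the fashion MNIST data:

```python
import numpy as np
import tensorflow as tf

train_image = np.zeros((60000, 28, 28), dtype=np.uint8)  # stand-in data
train_label = np.zeros(60000, dtype=np.uint8)

# Build two datasets, then pair their elements with zip()
image_ds = tf.data.Dataset.from_tensor_slices(train_image)
label_ds = tf.data.Dataset.from_tensor_slices(train_label)
dataset = tf.data.Dataset.zip((image_ds, label_ds))
print(dataset.element_spec)
# (TensorSpec(shape=(28, 28), dtype=tf.uint8, name=None),
#  TensorSpec(shape=(), dtype=tf.uint8, name=None))
```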

This will print the same spec as before.

The Dataset.zip() function is like Python's built-in zip() in the sense that it matches data one by one from multiple datasets into a tuple.

One benefit of using a tf.data dataset is the flexibility in handling the data. Below is the complete code for training a Keras model with a dataset, in which the batch size is set on the dataset:
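A self-contained sketch (random stand-in data and a deliberately small model; the detail to note is that batch(32) is applied to the dataset, so fit() takes no batch_size argument):

```python
import numpy as np
import tensorflow as tf

# Random stand-ins for the fashion MNIST arrays
train_image = np.random.rand(256, 28, 28).astype("float32")
train_label = np.random.randint(0, 10, size=256)

dataset = tf.data.Dataset.from_tensor_slices((train_image, train_label))

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# The batch size is set on the dataset, not on fit()
history = model.fit(dataset.batch(32), epochs=2, verbose=0)
```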

This is the simplest use case of a dataset. If we dive deeper, we can see that a dataset is just an iterator. Therefore, we can print out each sample in a dataset using the following:
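For example (a small stand-in dataset keeps the output short):

```python
import numpy as np
import tensorflow as tf

images = np.zeros((8, 28, 28), dtype=np.uint8)  # stand-in data
labels = np.arange(8, dtype=np.uint8)
dataset = tf.data.Dataset.from_tensor_slices((images, labels))

# A dataset is an iterator: looping over it yields one sample at a time
for image, label in dataset:
    print(image.shape, label.numpy())
```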

The dataset has many functions built in. The batch() function we used before is one of them. If we create batches from the dataset and print them, we have the following:
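A sketch with 100 stand-in samples, so the last batch is a partial one:

```python
import numpy as np
import tensorflow as tf

images = np.zeros((100, 28, 28), dtype=np.uint8)  # stand-in data
labels = np.zeros(100, dtype=np.uint8)
dataset = tf.data.Dataset.from_tensor_slices((images, labels))

# Each item yielded by a batched dataset is a batch of samples
for image_batch, label_batch in dataset.batch(32):
    print(image_batch.shape, label_batch.shape)
# three batches of shape (32, 28, 28), then a final (4, 28, 28) batch
```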

Each item we get from a batch is not a single sample but a batch of samples. There are also functions such as map(), filter(), and reduce() for sequence transformation, and concatenate() and interleave() for combining one dataset with another. Functions such as repeat(), take(), take_while(), and skip() resemble their familiar counterparts in Python's itertools module. A full list of the functions can be found in the API documentation.

Creating a Dataset from Generator Function

So far, we have seen how a dataset can be used in place of a NumPy array in training a Keras model. Indeed, a dataset can also be created out of a generator function. But instead of a generator function that generates a whole batch, as in one of the examples above, here we make a generator function that generates one sample at a time. The following is the function:
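A sketch of the per-sample generator, reconstructed from the description below (the name datagen is an assumption):

```python
import numpy as np

def datagen(image, label):
    # Shuffle an index vector to randomize the order of the samples
    idx = np.arange(len(image))
    np.random.shuffle(idx)
    for i in idx:
        yield image[i], label[i]  # one sample at a time; ends when done
```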

This function randomizes the input array by shuffling an index vector, then generates one sample at a time. Unlike the previous example, this generator ends when the samples from the array are exhausted.

We create a dataset from this function using from_generator(). We need to provide the generator function itself (not an instantiated generator) as well as the output signature of the dataset. This is required because the tf.data.Dataset API cannot infer the dataset's spec before the generator is consumed.
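A sketch of the call (stand-in arrays keep it self-contained; the output_signature matches the uint8 spec we saw earlier):

```python
import numpy as np
import tensorflow as tf

def datagen(image, label):
    idx = np.arange(len(image))
    np.random.shuffle(idx)
    for i in idx:
        yield image[i], label[i]

train_image = np.zeros((100, 28, 28), dtype=np.uint8)  # stand-in data
train_label = np.zeros(100, dtype=np.uint8)

# Pass a callable plus the spec of each element the generator yields
dataset = tf.data.Dataset.from_generator(
    lambda: datagen(train_image, train_label),
    output_signature=(
        tf.TensorSpec(shape=(28, 28), dtype=tf.uint8),
        tf.TensorSpec(shape=(), dtype=tf.uint8),
    ))
print(dataset.element_spec)
```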

Running the above code will print the same spec as before.

Such a dataset is functionally equivalent to the dataset that we created previously. Hence we can use it for training as before. The following is the complete code:
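A self-contained sketch of the whole pipeline (random stand-in data, a tiny model, and a short run; the article's version uses the real fashion MNIST arrays and more epochs):

```python
import numpy as np
import tensorflow as tf

def datagen(image, label):
    # Yield one (image, label) pair at a time in shuffled order
    idx = np.arange(len(image))
    np.random.shuffle(idx)
    for i in idx:
        yield image[i], label[i]

train_image = np.random.rand(256, 28, 28).astype("float32")  # stand-in
train_label = np.random.randint(0, 10, size=256).astype("int32")

dataset = tf.data.Dataset.from_generator(
    lambda: datagen(train_image, train_label),
    output_signature=(
        tf.TensorSpec(shape=(28, 28), dtype=tf.float32),
        tf.TensorSpec(shape=(), dtype=tf.int32),
    ))

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
history = model.fit(dataset.batch(32), epochs=2, verbose=0)
```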

Dataset with Prefetch

The real benefit of using a dataset comes with prefetch().

Using a NumPy array for training is probably the best option for performance. However, it means we need to load all the data into memory. Using a generator function for training allows us to prepare one batch at a time, so data can be loaded from disk on demand, for example. But with a generator function, either the training loop or the generator is running at any given moment. It is not easy to make the generator function and Keras' training loop run in parallel.

The Dataset API allows the generator and the training loop to run in parallel. If you have a generator that is computationally expensive (e.g., performing image augmentation in real time), you can create a dataset from such a generator function and then use it with prefetch(), as follows:
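A self-contained sketch (random stand-in data and a tiny model; the only change from the previous example is chaining prefetch(3) after batch(32)):

```python
import numpy as np
import tensorflow as tf

def datagen(image, label):
    # An expensive per-sample pipeline (e.g., augmentation) would go here
    idx = np.arange(len(image))
    np.random.shuffle(idx)
    for i in idx:
        yield image[i], label[i]

train_image = np.random.rand(256, 28, 28).astype("float32")  # stand-in
train_label = np.random.randint(0, 10, size=256).astype("int32")

dataset = tf.data.Dataset.from_generator(
    lambda: datagen(train_image, train_label),
    output_signature=(
        tf.TensorSpec(shape=(28, 28), dtype=tf.float32),
        tf.TensorSpec(shape=(), dtype=tf.int32),
    ))

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# prefetch(3) keeps up to three batches ready while training runs
history = model.fit(dataset.batch(32).prefetch(3), epochs=2, verbose=0)
```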

The numeric argument to prefetch() is the size of the buffer. Here we ask the dataset to keep three batches in memory, ready for the training loop to consume. Whenever a batch is consumed, the dataset API resumes the generator function to refill the buffer asynchronously in the background. Therefore, the training loop and the data preparation algorithm inside the generator function can run in parallel.

It is worth mentioning that, in the previous section, we created a shuffling generator for the Dataset API. The Dataset API also has a shuffle() function that does the same, but we may not want to use it unless the dataset is small enough to fit in memory.

The shuffle() function, like prefetch(), takes a buffer size argument. The shuffle algorithm fills the buffer from the dataset and draws one element from it at random; the consumed element is then replaced with the next element from the dataset. Hence, the buffer needs to be as large as the dataset itself for a truly random shuffle. We can demonstrate this limitation with the following snippet:
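A sketch of such a demonstration: shuffle the integers 0 to 99 with a buffer of only 10 elements (the exact numbers printed vary between runs):

```python
import tensorflow as tf

# Shuffle 0..99 with a buffer of only 10 elements
dataset = tf.data.Dataset.range(100).shuffle(buffer_size=10)
first20 = [int(x) for x in dataset.take(20)]
print(first20)
# The k-th element drawn can only come from the first k+10 elements,
# so large numbers never appear early in the output
```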

From the output, we can see that the numbers are only shuffled around their neighborhood, and we never see large numbers early in the output.

Further Reading

More about the tf.data.Dataset API can be found in its official API documentation.

Summary

In this post, you have seen what the tf.data dataset is and how it can be used in training a Keras model.

Specifically, you learned:

  • How to train a model using data from NumPy array, a generator, and a dataset
  • How to create a dataset using a NumPy array or a generator function
  • How to use prefetch with dataset to make the generator and training loop run in parallel


By GIL