The first task when developing a new recognition model is to gather and prepare the training dataset. Building pipelines that keep the data flowing smoothly during heavy training phases used to be an art, but TensorFlow's recent features make it quite straightforward to fetch and pre-process complex data, as demonstrated in the first notebooks of this chapter. Oftentimes, however, training data is simply unavailable. The remaining notebooks tackle these scenarios, presenting a variety of solutions.
(Reminder: Notebooks are better visualized with nbviewer: click here to continue on nbviewer.jupyter.org.)
- 7.1 - Setting up Efficient Input Pipelines with `tf.data`
- Harness the latest features of the `tf.data` API to set up optimized input pipelines to train models.
- 7.2 - Generating and Parsing TFRecords
- Discover how to convert complete datasets into TFRecords, and how to efficiently parse these files.
- 7.3 - Rendering Images from 3D Models
- Get a quick overview of 3D rendering with Python, using OpenGL-based `vispy` to generate a variety of images from 3D data.
- 7.4 - Training a Segmentation Model on Synthetic Images
- Use a pre-rendered dataset of synthetic images to train a model, and evaluate the effects of the realism gap on its final accuracy.
- 7.5 - Training a Simple Domain Adversarial Network
- Discover and implement a famous domain adaptation method: DANN.
- 7.6 - Applying DANN to Train the Segmentation Model on Synthetic Data
- Apply the DANN method from the previous notebook to improve the performance of our segmentation model, which suffers from the realism gap.
- 7.7 - Generating Images with VAEs
- Build your first generative neural network, a simple *Variational Auto-Encoder* (VAE), to create digit images.
- 7.8 - Generating Images with GANs
- Discover another famous type of generative model: *Generative Adversarial Networks* (GANs).
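To give a first flavor of the input pipelines covered in notebook 7.1, a minimal `tf.data` pipeline can be sketched as follows (random in-memory arrays stand in for real image files; the pre-processing step is illustrative):

```python
import numpy as np
import tensorflow as tf

# Stand-in data: 100 random 32x32 RGB "images" and their labels.
images = np.random.rand(100, 32, 32, 3).astype(np.float32)
labels = np.random.randint(0, 10, size=(100,)).astype(np.int32)

dataset = (tf.data.Dataset.from_tensor_slices((images, labels))
           .shuffle(buffer_size=100)               # randomize sample order
           .map(lambda x, y: (x * 2. - 1., y),     # example pre-processing step
                num_parallel_calls=tf.data.experimental.AUTOTUNE)
           .batch(32)                              # group samples into batches
           .prefetch(tf.data.experimental.AUTOTUNE))  # overlap I/O and training

for batch_images, batch_labels in dataset.take(1):
    print(batch_images.shape)  # (32, 32, 32, 3)
```

Letting `AUTOTUNE` pick the parallelism and prefetch depth is usually a good default; the notebook explores these knobs in detail.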
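The TFRecord workflow of notebook 7.2 boils down to serializing `tf.train.Example` protos and parsing them back with a matching feature description. A minimal sketch (the file path and feature names are illustrative):

```python
import os
import tempfile
import tensorflow as tf

record_path = os.path.join(tempfile.gettempdir(), 'sample.tfrecord')

def serialize_example(image_bytes, label):
    # Wrap raw values into the proto feature types expected by TFRecords:
    features = {
        'image': tf.train.Feature(bytes_list=tf.train.BytesList(value=[image_bytes])),
        'label': tf.train.Feature(int64_list=tf.train.Int64List(value=[label])),
    }
    return tf.train.Example(
        features=tf.train.Features(feature=features)).SerializeToString()

with tf.io.TFRecordWriter(record_path) as writer:
    writer.write(serialize_example(b'raw image bytes', 7))

# The feature description must mirror the schema used at writing time:
feature_description = {
    'image': tf.io.FixedLenFeature([], tf.string),
    'label': tf.io.FixedLenFeature([], tf.int64),
}

def parse_example(serialized):
    return tf.io.parse_single_example(serialized, feature_description)

parsed = next(iter(tf.data.TFRecordDataset(record_path).map(parse_example)))
print(parsed['label'].numpy())  # 7
```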
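The key ingredient of DANN (notebooks 7.5 and 7.6) is the gradient reversal layer: identity on the forward pass, but gradients flipped (and scaled by a factor lambda) on the backward pass, pitting the feature extractor against the domain classifier. A possible Keras sketch (the `lambda_` scheduling used in practice is omitted):

```python
import tensorflow as tf

class GradientReversal(tf.keras.layers.Layer):
    """Identity on the forward pass; multiplies gradients by -lambda_ backward."""

    def __init__(self, lambda_=1.0, **kwargs):
        super().__init__(**kwargs)
        self.lambda_ = lambda_

    def call(self, inputs):
        @tf.custom_gradient
        def _reverse(x):
            def grad(dy):
                return -self.lambda_ * dy
            return tf.identity(x), grad
        return _reverse(inputs)

# Quick check that gradients are indeed reversed:
x = tf.constant([1.0, 2.0, 3.0])
layer = GradientReversal(lambda_=1.0)
with tf.GradientTape() as tape:
    tape.watch(x)
    y = tf.reduce_sum(layer(x))
g = tape.gradient(y, x)
print(g.numpy())  # [-1. -1. -1.]
```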
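For the VAE of notebook 7.7, the two model-specific pieces are the reparameterization trick and the KL-divergence term of the loss. A minimal sketch (the encoder and decoder networks themselves are omitted):

```python
import tensorflow as tf

def sample_latent(z_mean, z_log_var):
    # Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I),
    # so gradients can flow through the (otherwise stochastic) sampling step.
    eps = tf.random.normal(tf.shape(z_mean))
    return z_mean + tf.exp(0.5 * z_log_var) * eps

def kl_divergence(z_mean, z_log_var):
    # Closed-form KL between N(mu, sigma^2) and the standard normal prior:
    return -0.5 * tf.reduce_sum(
        1. + z_log_var - tf.square(z_mean) - tf.exp(z_log_var), axis=-1)

# When mu = 0 and log(sigma^2) = 0, the posterior matches the prior exactly:
zeros = tf.zeros((1, 8))
print(kl_divergence(zeros, zeros).numpy())  # [0.]
```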
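And for notebook 7.8, a single GAN training step can be sketched as follows, using binary cross-entropy with the non-saturating generator loss (the `generator` and `discriminator` models are assumed to be provided; the toy models at the end are purely illustrative):

```python
import tensorflow as tf

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)

def train_step(generator, discriminator, g_opt, d_opt, real_samples, noise_dim):
    noise = tf.random.normal([tf.shape(real_samples)[0], noise_dim])
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        fake_samples = generator(noise, training=True)
        real_logits = discriminator(real_samples, training=True)
        fake_logits = discriminator(fake_samples, training=True)
        # Discriminator: real -> 1, fake -> 0; generator: fool it (fake -> 1).
        d_loss = (bce(tf.ones_like(real_logits), real_logits)
                  + bce(tf.zeros_like(fake_logits), fake_logits))
        g_loss = bce(tf.ones_like(fake_logits), fake_logits)
    d_opt.apply_gradients(
        zip(d_tape.gradient(d_loss, discriminator.trainable_variables),
            discriminator.trainable_variables))
    g_opt.apply_gradients(
        zip(g_tape.gradient(g_loss, generator.trainable_variables),
            generator.trainable_variables))
    return g_loss, d_loss

# Toy run on 4-dimensional "samples" with tiny dense models:
gen = tf.keras.Sequential([tf.keras.layers.Dense(4, input_shape=(8,))])
disc = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
g_loss, d_loss = train_step(gen, disc,
                            tf.keras.optimizers.Adam(), tf.keras.optimizers.Adam(),
                            tf.random.normal([16, 4]), noise_dim=8)
```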
- `cityscapes_utils.py`: utility functions for the Cityscapes dataset (code presented in notebook 6.4).
- `fcn.py`: functional implementation of the FCN-8s architecture (code presented in notebook 6.5).
- `keras_custom_callbacks.py`: custom Keras callbacks to monitor the training of models (code presented in notebooks 4.1 and 6.2).
- `plot_utils.py`: utility functions to display results (code presented in notebook 6.2).
- `renderer.py`: object-oriented pipeline to render images from 3D models (code presented in notebook 7.3).
- `synthia_utils.py`: utility functions for the SYNTHIA dataset (code presented in notebook 7.4).
- `tf_losses_and_metrics.py`: custom losses and metrics to train/evaluate CNNs (code presented in notebooks 6.5 and 6.6).
- `tf_math.py`: custom mathematical functions reused in other scripts (code presented in notebooks 6.5 and 6.6).