This course will take you through the steps that a machine learning engineer would take to train and deploy a deep learning model. We will start the course by defining an end goal that we want to achieve. Then, we will download a dataset that will help us achieve that goal. We will build a Convolutional Neural Network using TensorFlow with Keras, and then we will train this network on Google AI-Platform. After saving the best trained model, we will deploy it as a web app using Flask and Google Cloud Run. Throughout the course, we will be using Docker to containerize our code.
1. Introduction and course content
Introduction to the course and an outline of the course sections and content.
2. Operating System, Python IDE and Docker
Setting up your local machine with the right software.
3. Setting up Google Cloud Platform
Creating a Google Cloud account with $300 of free cloud credit.
4. Setting up a VS Code folder and creating a virtual environment using Virtualenv
Creating a setup for our code in Visual Studio Code.
5. Python packages we will be using and how to install them
Installing the Python packages that we will use during the course from a requirements.txt file that I will provide.
6. Testing your installation and setup
Testing our setup by running a few commands and some basic import code.
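A quick way to test an installation of this kind is a small script that checks every required package is importable. The package list below is only illustrative; the course's actual requirements.txt is the source of truth:

```python
import importlib.util

# Illustrative package list -- replace with the contents of requirements.txt.
REQUIRED = ["tensorflow", "numpy", "PIL", "google.cloud.storage", "flask"]

def missing_packages(names):
    """Return the subset of module names that cannot be found."""
    missing = []
    for name in names:
        try:
            found = importlib.util.find_spec(name) is not None
        except ModuleNotFoundError:  # parent package absent for dotted names
            found = False
        if not found:
            missing.append(name)
    return missing

if __name__ == "__main__":
    gone = missing_packages(REQUIRED)
    print("All packages found" if not gone else f"Missing: {gone}")
```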
Building your deep learning model

7. How do machine learning or deep learning projects usually work?
A description of how machine learning projects work from a high-level perspective.
8. What is our end goal?
Defining our end goal before we start diving into building our machine learning pipeline.
9. Downloading the dataset
Where to get the dataset that we will be using throughout the course.
10. Data exploration: splitting data into category folders
How to organize our dataset by splitting the data into folders that contain images of the same class.
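The dataset is not specified at this point, so the following is only a sketch; it assumes the class label is encoded in each filename (e.g. `cat.123.jpg`), which you would adapt via `label_fn` for your data:

```python
import shutil
from pathlib import Path

def split_into_class_folders(src_dir, dst_dir, label_fn):
    """Copy every file in src_dir into dst_dir/<label>/, where the label
    is derived from the filename by label_fn."""
    dst = Path(dst_dir)
    for path in Path(src_dir).iterdir():
        if not path.is_file():
            continue
        target = dst / label_fn(path.name)
        target.mkdir(parents=True, exist_ok=True)
        shutil.copy2(path, target / path.name)

def label_from_prefix(name):
    """Example label rule: 'cat.123.jpg' -> 'cat' (adapt to your dataset)."""
    return name.split(".")[0]
```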
11. Data exploration: visualizing random samples from the dataset
Adding a function that can help us visualize random samples from our dataset.
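A minimal version of such a sampling helper might look like the following; the grid display assumes matplotlib and Pillow are installed, and that images sit in per-class folders:

```python
import random
from pathlib import Path

def pick_random_samples(image_dir, k=9, exts=(".jpg", ".jpeg", ".png")):
    """Return up to k image paths drawn uniformly at random (recursive)."""
    paths = [p for p in Path(image_dir).rglob("*") if p.suffix.lower() in exts]
    return random.sample(paths, min(k, len(paths)))

def show_samples(image_dir, k=9):
    """Display a grid of random samples, titled by their folder (class) name."""
    import matplotlib.pyplot as plt
    from PIL import Image
    samples = pick_random_samples(image_dir, k)
    cols = 3
    rows = -(-len(samples) // cols)  # ceiling division
    fig, _ = plt.subplots(rows, cols, figsize=(9, 3 * rows))
    for ax in fig.axes:
        ax.axis("off")
    for ax, path in zip(fig.axes, samples):
        ax.imshow(Image.open(path))
        ax.set_title(path.parent.name)
    plt.show()
```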
12. Data exploration: getting insights about the widths and heights of images
Getting some insights from our dataset by computing the mean and median values of the widths and heights of the images.
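These statistics can be computed along the following lines; gathering the sizes assumes Pillow, while `size_stats` itself is pure Python:

```python
import statistics
from pathlib import Path

def image_sizes(image_dir, pattern="*.jpg"):
    """Collect (width, height) for every matching image (requires Pillow)."""
    from PIL import Image
    sizes = []
    for path in Path(image_dir).rglob(pattern):
        with Image.open(path) as img:
            sizes.append(img.size)
    return sizes

def size_stats(sizes):
    """Mean and median of widths and heights from a list of (w, h) pairs."""
    widths = [w for w, _ in sizes]
    heights = [h for _, h in sizes]
    return {
        "mean_width": statistics.mean(widths),
        "median_width": statistics.median(widths),
        "mean_height": statistics.mean(heights),
        "median_height": statistics.median(heights),
    }
```

These numbers are what typically inform the input resolution chosen for the network in the next section.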
13. What to consider when building a neural network for our task?
Things that you should consider when building a neural network to solve a specific problem.
14. Building the neural network architecture using Keras and TensorFlow
Building our neural network using Keras from TensorFlow. You will also leverage the power of transfer learning to build a powerful CNN-based classifier.
15. Creating data pipelines using generators
Creating data pipelines for augmenting our dataset. You will create generators that produce data during the training process, using ImageDataGenerator from Keras.
16. Putting everything together inside a train function
You will create a training function that makes use of the data pipelines and the deep learning model that you created before.
17. Improving and cleaning the code for robustness and automation
Clean code is robust code. In this lecture, we will clean our code for robustness and better readability.
18. Launching training locally on a subset of our data
Testing our code on a subset of the data to make sure that our pipeline is working properly.
19. Adding evaluation at the end of training
Adding the evaluation step to our code. We will be using our evaluation generator to evaluate our model.
20. Summary
Some final notes about this section.
Introduction to Google Cloud Storage

21. Our different setups for reading data during training
Reading data during training is a crucial part of your machine learning pipeline. In this lecture, I will outline our current setup and what we aim for by the time we start training on Google Cloud Platform.
22. What are buckets and how to create them
An introduction to Google Cloud Storage buckets and how to create them.
23. Uploading our data to the bucket
In this lecture, you will upload your dataset to the Google Cloud Storage bucket that you previously created.
24. Creating a credentials JSON file to allow access to our bucket
In order to access your data (which you've put on the cloud) from anywhere, you need the right credentials. In this lecture, I will show you how to create credentials in the form of a JSON file.
25. Problem with our credentials file and how to fix it
The credentials file that we created did not allow us to access the data in our bucket. In this lecture, I will show you why, and how to fix that.
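Once a working key file exists, Google's client libraries locate it through the standard GOOGLE_APPLICATION_CREDENTIALS environment variable. A minimal sketch (the key-file path is yours; a service-account key JSON contains a `project_id` field):

```python
import json
import os

def use_service_account(key_path):
    """Point Google client libraries at a service-account key file and
    return the project id stored inside it."""
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = key_path
    with open(key_path) as f:
        return json.load(f)["project_id"]
```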
26. Adding code for downloading data from the bucket
In this lecture, we will add a function that allows us to download data from a Google Cloud Storage bucket.
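A download helper of this kind might be sketched as follows, using the google-cloud-storage client; the bucket name and prefix are placeholders, and valid credentials are assumed:

```python
from pathlib import Path

def local_path_for(blob_name, dest_dir):
    """Map an object name like 'dataset/train/cat/1.jpg' to a local path
    under dest_dir, creating parent folders as needed."""
    path = Path(dest_dir) / blob_name
    path.parent.mkdir(parents=True, exist_ok=True)
    return path

def download_prefix(bucket_name, prefix, dest_dir):
    """Download every object under `prefix` from the bucket
    (requires google-cloud-storage and valid credentials)."""
    from google.cloud import storage
    client = storage.Client()
    for blob in client.list_blobs(bucket_name, prefix=prefix):
        if blob.name.endswith("/"):  # skip "directory" placeholder objects
            continue
        blob.download_to_filename(str(local_path_for(blob.name, dest_dir)))
```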
27. Verifying that our training pipeline is working properly with the new modifications
In this lecture, we will run our training pipeline to see if everything is working as it should.

Dockerizing our code

28. What is Docker and how to use it for our project? (optional)
A brief description of what Docker is and how to use it, from a high-level perspective.
29. Small modifications to our files
Modifying our previous code to prepare it for Docker.
30. Building a Docker image using Dockerfiles
How to create a Dockerfile for building Docker images.
31. Running a Docker container using our Docker image
How to run Docker containers using our Docker image.
32. Adding arguments to our training application using Argparse
Adding arguments to our training script.
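A minimal sketch of such a training CLI; the flag names and defaults here are illustrative rather than the course's exact set, though `--job-dir` matches the argument AI-Platform passes to training jobs:

```python
import argparse

def build_parser():
    """Command-line arguments for the training script (names illustrative)."""
    parser = argparse.ArgumentParser(description="Train the image classifier.")
    parser.add_argument("--epochs", type=int, default=10)
    parser.add_argument("--batch-size", type=int, default=32)
    parser.add_argument("--learning-rate", type=float, default=1e-3)
    parser.add_argument("--data-dir", default="data/",
                        help="Local folder or gs:// path with training data.")
    parser.add_argument("--job-dir", default="models/",
                        help="Where to write checkpoints; set by AI-Platform.")
    return parser

if __name__ == "__main__":
    args = build_parser().parse_args()
    print(args)
```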
33. Necessary steps to use Docker with GPUs
Adding all the necessary steps to be able to use GPUs with Docker.
34. Building our Docker image with GPU support
Building our new image after we set up our machine to use GPUs with Docker.
35. Summary
Summary and final notes about this section.

Training our deep learning model on AI-Platform

36. What is cloud computing and what is AI-Platform? (optional)
A brief explanation of what cloud computing is and what AI-Platform is.
37. What other APIs do we need?
Enabling the Google Cloud APIs that we need for our training process on the cloud.
38. Pushing our image to Google Container Registry
In this lecture, you will learn how to push your Docker image to Google Container Registry.
39. Setting things up for our training job
Preparing all the necessary setup to run a training job on AI-Platform.
40. Launching a training job on AI-Platform and checking the logs
Launching our training job on AI-Platform and checking the logs coming from the machine that is running the training on the cloud.
41. What is hyperparameter tuning?
An introduction to hyperparameter tuning.
42. Configuring hyperparameter tuning
Configuring our code to allow AI-Platform to perform hyperparameter tuning automatically.
43. Building a new Docker image with the new setup
Building a new Docker image with the hyperparameter tuning setup.
44. Launching a training job with the new setup
Launching the training process with hyperparameter tuning configured.
45. Saving our trained model (but there is a problem)
Saving our trained deep learning models by adding callbacks from Keras.
46. Adding a function to upload trained models to a Google Cloud Storage bucket
In this lecture, you will learn how to upload data to Google Cloud Storage buckets from your Python code.
47. Zipping and uploading trained models to Google Cloud Storage
In this lecture, you will learn how to zip and upload your best trained models to a Google Cloud Storage bucket.
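The zip-and-upload step can be sketched with `shutil.make_archive` plus the google-cloud-storage client; the bucket and object names are placeholders, and the upload assumes valid credentials:

```python
import shutil

def zip_model_dir(model_dir, archive_base):
    """Zip a saved-model directory; returns the path of the created .zip."""
    return shutil.make_archive(str(archive_base), "zip", root_dir=model_dir)

def upload_file(bucket_name, local_path, blob_name):
    """Upload a local file to a Google Cloud Storage bucket
    (requires google-cloud-storage and valid credentials)."""
    from google.cloud import storage
    client = storage.Client()
    client.bucket(bucket_name).blob(blob_name).upload_from_filename(str(local_path))

# Example usage (placeholder names):
#   archive = zip_model_dir("models/best", "models/best_model")
#   upload_file("my-training-bucket", archive, "trained_models/best_model.zip")
```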
48. Running the final training job
In this video, we will configure our code to finally run the training on the complete dataset.
49. Summary
Summary and final notes about this section.

Serving our trained model as a web app using Cloud Run and Flask