Train and deploy deep learning models
Description
This course will take you through the steps that a machine learning engineer would take to train and deploy a deep learning model. We will start the course by defining an end goal that we want to achieve. Then, we will download a dataset that will help us achieve that goal. We will build a Convolutional Neural Network using TensorFlow with Keras, and we will train this network on Google AI Platform. After saving the best trained model, we will deploy it as a web app using Flask and Google Cloud Run. Throughout the course, we will use Docker to containerize our code.
Curriculum

1. Introduction and course content (Video lesson)
Introduction to the course and an outline of the course sections and content.
2. Operating System, Python IDE and Docker (Video lesson)
Setting up your local machine with the right software.
3. Setting up Google Cloud Platform (Video lesson)
Creating a Google Cloud account with $300 of free cloud credit.
4. Setting up a VS Code folder and creating a virtual environment using virtualenv (Video lesson)
Creating a setup for our code in Visual Studio Code.
5. Python packages we will be using and how to install them (Video lesson)
Installing the Python packages that we will use during the course from a requirements.txt file that I will provide.
6. Testing your installation and setup (Video lesson)
Testing our setup by running a few commands and some basic import code.
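A sanity check along these lines is usually enough to confirm that the setup works (the package list below is illustrative; the authoritative list is the course's requirements.txt):

```python
# Sanity check: verify that the core packages import and report versions.
import sys

import numpy as np
import tensorflow as tf

print("Python:", sys.version)
print("NumPy:", np.__version__)
print("TensorFlow:", tf.__version__)
print("GPUs visible:", tf.config.list_physical_devices("GPU"))
```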
7. How do machine learning or deep learning projects usually work? (Video lesson)
A description of how machine learning projects work from a high-level perspective.
8. What is our end goal? (Video lesson)
Defining our end goal before we start diving into building our machine learning pipeline.
9. Downloading the dataset (Video lesson)
Where to get the dataset that we will be using throughout the course.
10. Data exploration: splitting data into category folders (Video lesson)
How to organize our dataset by splitting the data into folders that contain images of the same class.
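As a rough sketch of the idea, assuming the class label can be derived from each filename (the get_label rule and the folder paths below are hypothetical, not the course's exact dataset layout):

```python
import shutil
from pathlib import Path

def split_into_category_folders(src_dir: str, dst_dir: str) -> None:
    """Copy each image into a subfolder named after its class."""
    def get_label(filename: str) -> str:
        # Hypothetical labeling rule: class name before the first underscore,
        # e.g. "cat_001.jpg" -> "cat". Adapt to your dataset.
        return filename.split("_")[0]

    for image_path in Path(src_dir).glob("*.jpg"):
        target = Path(dst_dir) / get_label(image_path.name)
        target.mkdir(parents=True, exist_ok=True)
        shutil.copy2(image_path, target / image_path.name)

split_into_category_folders("raw_images", "data/train")
```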
11. Data exploration: visualizing random samples from the dataset (Video lesson)
Adding a function that can help us visualize random samples from our dataset.
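A minimal version of such a function, assuming the class-per-folder layout from the previous lecture and matplotlib for plotting:

```python
import random
from pathlib import Path

import matplotlib.pyplot as plt
from PIL import Image

def show_random_samples(data_dir: str, n: int = 9) -> None:
    """Plot n randomly chosen images in a 3x3 grid, titled by class."""
    paths = random.sample(list(Path(data_dir).glob("*/*.jpg")), n)
    for i, path in enumerate(paths):
        plt.subplot(3, 3, i + 1)
        plt.imshow(Image.open(path))
        plt.title(path.parent.name)  # the folder name is the class label
        plt.axis("off")
    plt.tight_layout()
    plt.show()

show_random_samples("data/train")
```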
12. Data exploration: getting insights about widths and heights of images (Video lesson)
Getting some insights from our dataset by computing the mean and median values of the widths and heights of images.
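For example, a sketch along these lines (the paths are illustrative) computes those statistics with Pillow and NumPy; the results help pick a sensible input size for the network:

```python
from pathlib import Path

import numpy as np
from PIL import Image

# Collect (width, height) pairs for every image in the dataset.
sizes = np.array([Image.open(p).size for p in Path("data/train").glob("*/*.jpg")])
widths, heights = sizes[:, 0], sizes[:, 1]

print("width  mean / median:", widths.mean(), np.median(widths))
print("height mean / median:", heights.mean(), np.median(heights))
```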
13. What to consider when building a neural network for our task? (Video lesson)
Things that you should consider when building a neural network to solve a specific problem.
14. Building the neural network architecture using Keras and TensorFlow (Video lesson)
Building our neural network using Keras from TensorFlow. You will also leverage the power of transfer learning to build a powerful CNN-based classifier.
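A minimal transfer-learning sketch in this spirit, assuming MobileNetV2 as the pretrained base (the base architecture, input size, and class count used in the course may differ):

```python
from tensorflow import keras

def build_model(num_classes: int, input_shape=(224, 224, 3)) -> keras.Model:
    # Convolutional base pretrained on ImageNet, without its classifier head.
    base = keras.applications.MobileNetV2(
        input_shape=input_shape, include_top=False, weights="imagenet"
    )
    base.trainable = False  # freeze the base for the initial training phase

    model = keras.Sequential([
        base,
        keras.layers.GlobalAveragePooling2D(),
        keras.layers.Dropout(0.2),
        keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(
        optimizer="adam",
        loss="categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model
```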
15. Creating data pipelines using generators (Video lesson)
Creating data pipelines for augmenting our dataset. You will create generators that produce batches of data during the training process, all using ImageDataGenerator from Keras.
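For instance, under the class-per-folder layout used earlier (the augmentation parameters below are illustrative):

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Augment the training data on the fly; only rescale the validation data.
train_datagen = ImageDataGenerator(
    rescale=1.0 / 255,
    rotation_range=20,
    zoom_range=0.2,
    horizontal_flip=True,
)
val_datagen = ImageDataGenerator(rescale=1.0 / 255)

train_generator = train_datagen.flow_from_directory(
    "data/train", target_size=(224, 224), batch_size=32, class_mode="categorical"
)
val_generator = val_datagen.flow_from_directory(
    "data/val", target_size=(224, 224), batch_size=32, class_mode="categorical"
)
```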
16. Putting everything together inside a train function (Video lesson)
You will create a training function that makes use of the data pipelines and the deep learning model that you created before.
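A bare-bones sketch, reusing the build_model function and the generators from the previous sketches:

```python
def train(epochs: int = 10):
    """Tie the data pipelines and the model together in one place."""
    model = build_model(num_classes=train_generator.num_classes)
    history = model.fit(
        train_generator,
        validation_data=val_generator,
        epochs=epochs,
    )
    return model, history
```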
17. Improving and cleaning the code for robustness and automation (Video lesson)
Clean code is robust code. In this lecture, we will clean up our code for robustness and better readability.
18. Launching training locally on a subset of our data (Video lesson)
Testing our code on a subset of the data to make sure that our pipeline is working properly.
19. Adding evaluation at the end of training (Video lesson)
Adding the evaluation step to our code. We will use our evaluation generator to evaluate our model.
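A minimal sketch, reusing names from the earlier sketches and assuming a held-out data/test folder:

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# No augmentation for evaluation; keep the sample order deterministic.
eval_generator = ImageDataGenerator(rescale=1.0 / 255).flow_from_directory(
    "data/test", target_size=(224, 224), batch_size=32,
    class_mode="categorical", shuffle=False,
)
loss, accuracy = model.evaluate(eval_generator)
print(f"test loss={loss:.4f}  test accuracy={accuracy:.4f}")
```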
20. Summary (Video lesson)
Some final notes about this section.
21. Our different setups for reading data during training (Video lesson)
Reading data during training is a crucial part of your machine learning pipeline. In this lecture, I will outline our current setup and what we are aiming for by the time we start training on Google Cloud Platform.
22. What are buckets and how to create them (Video lesson)
An introduction to Google Cloud Storage buckets and how to create them.
23. Uploading our data to the bucket (Video lesson)
In this lecture, you will upload your dataset to the Google Cloud Storage bucket that you previously created.
24. Creating a credentials JSON file to allow access to our bucket (Video lesson)
In order to access your data (that you've put on the cloud) from anywhere, you need the right credentials. In this lecture, I will show you how to create credentials in the form of a JSON file.
25. Problem with our credentials file and how to fix it (Video lesson)
The credentials file that we created in the previous lecture did not allow us to access the data in our bucket. In this lecture, I will show you why, and how to fix it.
26. Adding code for downloading data from the bucket (Video lesson)
In this lecture, we will add a function that allows us to download data from our Google Cloud Storage bucket.
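A sketch of such a helper using the google-cloud-storage client library (the bucket name and prefix are placeholders; the client picks up the credentials JSON via the GOOGLE_APPLICATION_CREDENTIALS environment variable):

```python
from pathlib import Path

from google.cloud import storage

def download_from_bucket(bucket_name: str, prefix: str, dst_dir: str) -> None:
    """Download every blob under `prefix` from the bucket into dst_dir."""
    client = storage.Client()
    for blob in client.list_blobs(bucket_name, prefix=prefix):
        local_path = Path(dst_dir) / blob.name
        local_path.parent.mkdir(parents=True, exist_ok=True)
        blob.download_to_filename(str(local_path))

download_from_bucket("my-training-bucket", "dataset/", "data")
```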
27. Verifying that our training pipeline is working properly with the new modifications (Video lesson)
In this lecture, we will run our training pipeline to see if everything is working as it should.
28. What is Docker and how to use it for our project? (optional) (Video lesson)
A brief description of what Docker is and how to use it from a high-level perspective.
29. Small modifications to our files (Video lesson)
Modifying our previous code to prepare it for Docker.
30. Building a Docker image using Dockerfiles (Video lesson)
How to create a Dockerfile for building Docker images.
31. Running a Docker container using our Docker image (Video lesson)
How to run Docker containers using our Docker image.
32. Adding arguments to our training application using argparse (Video lesson)
Adding arguments to our training script.
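A minimal argparse sketch; the flag names and defaults below are illustrative, not the course's exact interface:

```python
import argparse

def get_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="Training application")
    parser.add_argument("--epochs", type=int, default=10)
    parser.add_argument("--batch-size", type=int, default=32)
    parser.add_argument("--learning-rate", type=float, default=1e-3)
    parser.add_argument("--data-dir", type=str, default="data/train")
    return parser.parse_args()

if __name__ == "__main__":
    args = get_args()
    print(args)  # e.g. run: python train.py --epochs 20 --batch-size 64
```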
33. Necessary steps to use Docker with GPUs (Video lesson)
Adding all the necessary steps to be able to use GPUs with Docker.
34. Building our Docker image with GPU support (Video lesson)
Building our new image after we set up our machine to use GPUs with Docker.
35. Summary (Video lesson)
Summary and final notes about this section.
36. What is cloud computing and what is AI Platform? (optional) (Video lesson)
A brief explanation of what cloud computing is and what AI Platform is.
37. What other APIs do we need? (Video lesson)
Enabling the Google Cloud APIs that we need for our training process on the cloud.
38. Pushing our image to Google Container Registry (Video lesson)
In this lecture, you will learn how to push your Docker image to Google Container Registry.
39. Setting things up for our training job (Video lesson)
Preparing all the necessary setup to run a training job on AI Platform.
40. Launching a training job on AI Platform and checking the logs (Video lesson)
Launching our training job on AI Platform and checking the logs coming from the machine that is running the training in the cloud.
41. What is hyperparameter tuning? (Video lesson)
An introduction to hyperparameter tuning.
42. Configuring hyperparameter tuning (Video lesson)
Configuring our code to allow AI Platform to perform hyperparameter tuning automatically.
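On AI Platform this is commonly done with the cloudml-hypertune package, which reports your optimization metric back to the tuning service. A minimal sketch (the metric tag must match the hyperparameterMetricTag declared in your tuning configuration):

```python
import hypertune  # pip install cloudml-hypertune

hpt = hypertune.HyperTune()

# Report the metric that AI Platform should optimize, e.g. once per epoch
# or once at the end of training.
hpt.report_hyperparameter_tuning_metric(
    hyperparameter_metric_tag="val_accuracy",
    metric_value=0.93,  # e.g. history.history["val_accuracy"][-1]
    global_step=10,
)
```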
43. Building a new Docker image with the new setup (Video lesson)
Building a new Docker image with the hyperparameter tuning setup.
44. Launching a training job with the new setup (Video lesson)
Launching the training process with hyperparameter tuning configured.
45. Saving our trained model (but there is a problem) (Video lesson)
Saving our trained deep learning models by adding callbacks from Keras.
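A typical checkpoint callback looks like this (the file path and monitored metric are illustrative); the following lectures deal with getting the saved file off the training machine and into durable storage:

```python
from tensorflow.keras.callbacks import ModelCheckpoint

# Keep only the best model seen so far, judged by validation accuracy.
checkpoint = ModelCheckpoint(
    filepath="best_model.h5",
    monitor="val_accuracy",
    save_best_only=True,
    verbose=1,
)
# Passed to training via: model.fit(..., callbacks=[checkpoint])
```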
46. Adding a function to upload trained models to a Google Cloud Storage bucket (Video lesson)
In this lecture, you will learn how to upload data to Google Cloud Storage buckets from your Python code.
47. Zipping and uploading trained models to Google Cloud Storage (Video lesson)
In this lecture, you will learn how to zip your best trained models and upload them to a Google Cloud Storage bucket.
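A combined sketch using shutil.make_archive and the same storage client as before (the bucket and blob names are placeholders):

```python
import shutil

from google.cloud import storage

def zip_and_upload(model_dir: str, bucket_name: str, blob_name: str) -> None:
    """Zip a directory of saved models and upload the archive to a bucket."""
    archive = shutil.make_archive("trained_models", "zip", root_dir=model_dir)
    storage.Client().bucket(bucket_name).blob(blob_name).upload_from_filename(archive)

zip_and_upload("checkpoints", "my-training-bucket", "models/trained_models.zip")
```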
48. Running the final training job (Video lesson)
In this video, we will configure our code to finally run the training on the complete dataset.
49. Summary (Video lesson)
Summary and final notes about this section.