Take Amazon SageMaker Studio Lab for a Spin
Introduced as a preview at the Amazon Web Services' re:Invent 2021 conference, SageMaker Studio Lab is a free, standalone machine learning development environment based on the popular JupyterLab IDE. Except for the branding, the service has almost nothing to do with SageMaker. For a detailed overview of the service, read my previous article.
In this tutorial, I will walk you through the steps of training an end-to-end deep learning model for image classification on Amazon SageMaker Studio Lab. We will build a model that distinguishes between cats and dogs. (Be sure to check back all this week for additional SageMaker Studio Lab tutorials.)
Step 1: Request Access and Sign In
Visit https://studiolab.sagemaker.aws/ to request a free Amazon SageMaker Studio Lab account.
It may take a few hours to a couple of days for you to get access to the environment. Wait for the email confirmation.
Once approved, sign in to your account with your credentials.
Select the GPU compute type and click the Start runtime button.
When the runtime is ready, click Open project.
The JupyterLab environment is now ready for experimentation.
Step 2: Prepare the Environment
From the launcher, click on the terminal icon to start a new terminal session. Clone the Git repository that has the Conda environment configuration and the notebooks.
git clone https://github.com/janakiramm/dogs-vs-cats
Navigate to the dogs-vs-cats folder and right-click the env_tf2.yaml file to create a new Conda environment. This file lists all the modules needed to train a TensorFlow/Keras model.
Refresh the browser to see a new kernel named tf2:Python.
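To confirm that the new kernel works and that TensorFlow can see the GPU runtime, you can run a quick sanity check in a notebook cell. This is just a convenience check and is not part of the repository's notebooks:

# Quick sanity check in a notebook cell running the tf2:Python kernel
import tensorflow as tf

print("TensorFlow version:", tf.__version__)
# Should list one GPU device if the GPU compute type was selected at startup
print("GPU devices:", tf.config.list_physical_devices("GPU"))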
Before we can start training the model, we need to download the dataset. Log in to Kaggle and download the file train.zip from the Dogs vs. Cats competition.
Upload train.zip into the dataset folder of the repo that we cloned in the previous step. Launch a terminal session and unzip the file in the same folder. You should now have a new folder, /dogs-vs-cats/dataset/train/.
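To verify that the archive extracted correctly, you can count the images from a notebook cell. The snippet below assumes the Kaggle archive's usual naming scheme of cat.<n>.jpg and dog.<n>.jpg and is run from the home directory; adjust the path if you run it elsewhere:

# Count cat and dog images after extraction (assumes Kaggle's cat.*.jpg / dog.*.jpg naming)
from pathlib import Path

train_dir = Path("dogs-vs-cats/dataset/train")
cats = list(train_dir.glob("cat.*.jpg"))
dogs = list(train_dir.glob("dog.*.jpg"))
print(f"cats: {len(cats)}, dogs: {len(dogs)}")  # the full competition set has 12,500 of each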
We now have the environment fully configured to kick off the training job within Amazon SageMaker Studio Lab.
Step 3: Train the Computer Vision Model to Classify Images
Navigate to the train folder of the repository and launch the dogs-vs-cats.ipynb notebook.
If prompted for the kernel, choose tf2:Python.
This notebook loads the dataset we downloaded and trains the image classification model. Run the cells to complete the training. It may take up to 15 minutes for the training to complete.
In my experiment, the model reached an accuracy of 87.5%, which may be improved by increasing the number of epochs.
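The exact data pipeline, architecture, and hyperparameters live in the notebook, so treat the following only as a rough sketch of the pattern it follows: a small Keras convolutional network trained on the extracted images, with the input size, layers, and epoch count below chosen purely for illustration.

# Illustrative sketch of a binary image classifier; the notebook's actual implementation differs
import tensorflow as tf

IMG_SIZE = (160, 160)          # illustrative input size
DATA_DIR = "../dataset/train"  # relative to the train/ folder of the repo

def load_example(path):
    # The label is derived from the file name: images are named cat.<n>.jpg or dog.<n>.jpg
    filename = tf.strings.split(path, "/")[-1]
    label = tf.cast(tf.strings.regex_full_match(filename, "dog.*"), tf.float32)
    image = tf.io.decode_jpeg(tf.io.read_file(path), channels=3)
    image = tf.image.resize(image, IMG_SIZE) / 255.0
    return image, label

files = tf.data.Dataset.list_files(DATA_DIR + "/*.jpg", shuffle=True)
dataset = (files.map(load_example, num_parallel_calls=tf.data.AUTOTUNE)
                .batch(32)
                .prefetch(tf.data.AUTOTUNE))

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=IMG_SIZE + (3,)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # single sigmoid output: 1 = dog, 0 = cat
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(dataset, epochs=10)  # more epochs (or a validation split) may push accuracy higher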
When the model is ready, it is exported to the model/export/Servo/1 directory in the TensorFlow Serving format.
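If you are adapting this pattern in your own notebook, saving into that layout is a single call; the relative path below assumes the notebook runs from the train/ folder:

# Export the trained model as a TensorFlow SavedModel under a versioned directory
export_path = "../model/export/Servo/1"
model.save(export_path)  # writes saved_model.pb plus the variables/ and assets/ folders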
Step 4: Perform Inference on the Trained Model
Navigate to the infer folder and open the inference notebook. We load the saved model from /model/export/Servo/1/ and use it for inference.
import tensorflow
model = tensorflow.keras.models.load_model("../model/export/Servo/1/")
When an image is appropriately resized and preprocessed, it can be sent to the model. In my tests, the model predicted the correct class for the sample images.
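As a sketch of what that preprocessing can look like (the file name sample.jpg, the 160x160 input size, and the 1 = dog convention are assumptions carried over from the training sketch above; match them to how your model was actually trained):

# Preprocess a single image and run it through the model loaded in the previous cell
import numpy as np
import tensorflow

img = tensorflow.keras.preprocessing.image.load_img("sample.jpg", target_size=(160, 160))
x = tensorflow.keras.preprocessing.image.img_to_array(img) / 255.0  # same scaling as training
x = np.expand_dims(x, axis=0)                                       # add the batch dimension

prediction = model.predict(x)[0][0]
print("dog" if prediction > 0.5 else "cat")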
You can easily upload the model to Amazon S3 using the Python Boto3 module to deploy it in Amazon SageMaker.
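Here is a minimal sketch of that upload, assuming AWS credentials are configured and the bucket already exists; the bucket name and key are placeholders, and the archive layout is just one common way to package a SavedModel as the model.tar.gz that SageMaker expects:

# Package the SavedModel directory and upload it to S3 (bucket name and key are placeholders)
import tarfile
import boto3

with tarfile.open("model.tar.gz", "w:gz") as tar:
    tar.add("model/export", arcname="export")  # keeps the export/Servo/1 layout inside the archive

s3 = boto3.client("s3")
s3.upload_file("model.tar.gz", "my-sagemaker-bucket", "dogs-vs-cats/model.tar.gz")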
In the next part of this series, which will run all this week, we will use the image classification model to create a serverless inference endpoint in Amazon SageMaker. Stay tuned.