Using a Pre-Trained ONNX Model for Inferencing
In the previous part of this series, I introduced the Open Neural Network Exchange (ONNX) and the ONNX Runtime as an interoperable toolkit and platform for machine learning and deep learning models.
In this tutorial, we will explore how to use an existing ONNX model for inferencing. In about 30 lines of code, including preprocessing of the input image, we will run inference with the MNIST model to predict the digit in an image.
The objective of this tutorial is to make you familiar with the ONNX file format and runtime.
Setting up the Environment
To complete this tutorial, you need Python 3.x running on your machine. We will start by creating a Python3 virtual environment to isolate it from the main Python environment on the machine.
python3 -m venv onnx_mnist
source onnx_mnist/bin/activate
With the virtual environment in place, let’s install the Python modules needed by our program. The following command will install ONNX, ONNX Runtime, and OpenCV in your environment.
pip install onnx onnxruntime opencv-python
Let’s download and expand the pre-trained MNIST model, trained with the Microsoft Cognitive Toolkit (CNTK), from the ONNX Model Zoo.
wget https://www.cntk.ai/OnnxModels/mnist/opset_7/mnist.tar.gz
tar -xzf mnist.tar.gz

The above commands result in a new directory called mnist that contains the model and the test data serialized into ProtoBuf files. We are not going to use the test data for this tutorial.
We can now examine the model through the Netron tool by opening the model.onnx file.
As shown in the graph rendered by Netron, the MNIST model from the ONNX Model Zoo applies max pooling to downsample the feature maps produced by its convolutions.
The model has two convolutional layers, two maxpool layers, one dense layer, and an output layer that classifies the input as one of the 10 values representing the labels used in the MNIST dataset.
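If you prefer to inspect the model programmatically rather than through Netron, a minimal sketch with the onnx package (assuming the archive was extracted to ./mnist as above) looks like this:

import onnx

# Load the serialized ProtoBuf file and validate its structure
model = onnx.load("./mnist/model.onnx")
onnx.checker.check_model(model)

# List the operator types in the graph and the input/output node names
print({node.op_type for node in model.graph.node})
print([i.name for i in model.graph.input])
print([o.name for o in model.graph.output])

The printed operator set should include the Conv and MaxPool nodes visible in the Netron graph.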
Writing Inference Code for Prediction
We will now write code for performing inference on the pre-trained MNIST model.
Let’s start by importing the right Python modules.
import json
import sys
import os
import time
import numpy as np
import cv2
import onnx
import onnxruntime
from onnx import numpy_helper
Notice that we are using ONNX, ONNX Runtime, and the NumPy helper modules related to ONNX.
The ONNX module helps in parsing the model file while the ONNX Runtime module is responsible for creating a session and performing inference.
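As a quick sanity check that the runtime is installed correctly, you can print its version and the execution providers available on your machine; both calls are part of the standard onnxruntime API:

import onnxruntime

# Show the installed version and the hardware backends ONNX Runtime can use
print(onnxruntime.__version__)
print(onnxruntime.get_available_providers())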
Next, we will initialize some variables to hold the path of the model files and command-line arguments.
model_dir = "./mnist"
model = model_dir + "/model.onnx"
path = sys.argv[1]
In the next step, we will load the image and preprocess it with OpenCV.
# Preprocess the image
img = cv2.imread(path)
img = np.dot(img[..., :3], [0.299, 0.587, 0.114])
img = cv2.resize(img, dsize=(28, 28), interpolation=cv2.INTER_AREA)
img.resize((1, 1, 28, 28))

The above snippet converts the image to grayscale, resizes it to a 28×28 array, and reshapes it to (1, 1, 28, 28), the input shape expected by the model.
We will now convert the image into a NumPy array of type float32.
data = json.dumps({'data': img.tolist()})
data = np.array(json.loads(data)['data']).astype('float32')
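The JSON round trip above simply serializes the array to a nested list and back; for purely local inference it is not strictly required, and a simpler equivalent conversion would be:

# Equivalent conversion without the JSON round trip
data = img.astype('float32')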
We are now ready to pass the data to the model for inference.
session = onnxruntime.InferenceSession(model, None)
input_name = session.get_inputs()[0].name
output_name = session.get_outputs()[0].name
print(input_name)
print(output_name)

We need to use the same names as the input and output nodes of the neural network. You can easily retrieve them through the session.get_inputs() and session.get_outputs() methods. The output of the above snippet matches the input and output node names shown by Netron.
Let’s pass the input to the session and print the prediction.
result = session.run([output_name], {input_name: data})
prediction = int(np.argmax(np.array(result).squeeze(), axis=0))
print(prediction)
We apply NumPy’s argmax function to retrieve the index of the output with the highest score, which corresponds to the predicted digit.
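The model’s output is a vector of 10 scores rather than normalized probabilities. If you also want a confidence value for the prediction, one option is to apply a softmax to the scores; the snippet below is a small sketch that reuses the result and prediction variables from above:

# Convert the raw scores into probabilities with a numerically stable softmax
scores = np.array(result).squeeze()
probabilities = np.exp(scores - np.max(scores))
probabilities /= probabilities.sum()
print(probabilities[prediction])  # confidence of the predicted digit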
Try running the code by passing it an image of a handwritten digit; the model predicts it with good accuracy.
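For example, assuming you saved the script as inference.py (a file name chosen here for illustration) and have a test image named digit.png, you would run:

python inference.py digit.png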
Here is the complete code for your reference:
import json
import sys
import os
import time
import numpy as np
import cv2
import onnx
import onnxruntime
from onnx import numpy_helper

model_dir = "./mnist"
model = model_dir + "/model.onnx"
path = sys.argv[1]

# Preprocess the image
img = cv2.imread(path)
img = np.dot(img[..., :3], [0.299, 0.587, 0.114])
img = cv2.resize(img, dsize=(28, 28), interpolation=cv2.INTER_AREA)
img.resize((1, 1, 28, 28))

data = json.dumps({'data': img.tolist()})
data = np.array(json.loads(data)['data']).astype('float32')
session = onnxruntime.InferenceSession(model, None)
input_name = session.get_inputs()[0].name
output_name = session.get_outputs()[0].name
#print(input_name)
#print(output_name)

result = session.run([output_name], {input_name: data})
prediction = int(np.argmax(np.array(result).squeeze(), axis=0))
print(prediction)
In the next part of this tutorial, we will learn how to export a PyTorch model and convert it into a TensorFlow SavedModel file. Stay tuned.
Janakiram MSV’s Webinar series, “Machine Intelligence and Modern Infrastructure (MI2)” offers informative and insightful sessions covering cutting-edge technologies. Sign up for the upcoming MI2 webinar at http://mi2.live.
Feature image by DavidRockDesign from Pixabay.