Deploying machine learning models with Azure Machine Learning Flashcards
What does inferencing mean in machine learning?
In machine learning, inferencing refers to using a trained model to predict labels for new data on which the model has not been trained. Often, the model is deployed as part of a service that enables applications to request immediate, or real-time, predictions for individual or small batches of data observations
What does Azure Machine Learning use as its deployment mechanism?
Azure Machine Learning uses containers as its deployment mechanism: the model and the code to use it are packaged as an image that can be deployed to a container in your chosen compute target
What steps do you have to go through to deploy a real-time inferencing service?
- Register a trained model
- Define an inference configuration
- Define a deployment configuration
- Deploy the model
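Assuming the v1 azureml-core SDK and hypothetical names (`model.pkl`, `score.py`, `my-service`), the four steps above can be sketched as:

```python
from azureml.core import Workspace, Model, Environment
from azureml.core.model import InferenceConfig
from azureml.core.webservice import AciWebservice

ws = Workspace.from_config()  # assumes a config.json for your workspace

# 1. Register a trained model (path and name are hypothetical).
model = Model.register(workspace=ws, model_path="model.pkl",
                       model_name="my_model")

# 2. Define an inference configuration: an entry script plus an environment.
env = Environment.get(workspace=ws, name="AzureML-Minimal")
inference_config = InferenceConfig(entry_script="score.py", environment=env)

# 3. Define a deployment configuration (here: Azure Container Instances).
deployment_config = AciWebservice.deploy_configuration(cpu_cores=1,
                                                       memory_gb=1)

# 4. Deploy the model as a real-time service and wait for it to come up.
service = Model.deploy(workspace=ws, name="my-service", models=[model],
                       inference_config=inference_config,
                       deployment_config=deployment_config)
service.wait_for_deployment(show_output=True)
```

This sketch only runs against a real Azure Machine Learning workspace; ACI is shown here as the simplest compute target for test deployments, while production services typically target Azure Kubernetes Service instead.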
Steps to deploy a real-time inferencing service:
Define an Inference Configuration
When the model is deployed, it runs as a service that consists of two things:
- A script to load the model and return predictions for submitted data
- An environment in which the script will be run
You must therefore define a script and an environment for the service
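A minimal sketch of defining such an environment with the v1 azureml-core SDK; the environment name and package choices are illustrative, not prescribed:

```python
from azureml.core import Environment
from azureml.core.conda_dependencies import CondaDependencies

# Hypothetical environment for a scikit-learn model.
env = Environment(name="inference-env")
deps = CondaDependencies.create(
    conda_packages=["scikit-learn"],     # whatever your model needs
    pip_packages=["azureml-defaults"],   # required for Azure ML web services
)
env.python.conda_dependencies = deps
```

The environment is then passed, together with the entry script, to the inference configuration.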
For an Inference Configuration you must create an entry script (sometimes referred to as the scoring script) for the service as a Python file. It must include two functions. Which ones?
- init(): called when the service is initialized
- run(raw_data): called when new data is submitted to the service
Typically, you use the init function to load the model from the model registry, and the run function to generate predictions from the input data
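The init/run pattern can be sketched as below. To keep the sketch self-contained and runnable, a stand-in model is pickled to a temporary file; a real entry script would instead locate the registered model, e.g. via the SDK's Model.get_model_path:

```python
import json
import os
import pickle
import tempfile

# Stand-in "model" so the sketch runs anywhere; hypothetical logic.
class ThresholdModel:
    def predict(self, rows):
        return [1 if sum(r) > 0 else 0 for r in rows]

# Simulate a serialized model on disk (hypothetical path).
MODEL_PATH = os.path.join(tempfile.gettempdir(), "model.pkl")
with open(MODEL_PATH, "wb") as f:
    pickle.dump(ThresholdModel(), f)

model = None

def init():
    # Called once when the service starts: load the model into memory.
    global model
    with open(MODEL_PATH, "rb") as f:
        model = pickle.load(f)

def run(raw_data):
    # Called per request: parse the JSON payload, predict, return JSON.
    data = json.loads(raw_data)["data"]
    predictions = model.predict(data)
    return json.dumps({"result": predictions})

init()
result = run(json.dumps({"data": [[1, 2], [-3, 1]]}))
print(result)  # → {"result": [1, 0]}
```

The JSON request/response shape (`{"data": [...]}` in, `{"result": [...]}` out) is a common convention, not a requirement; the service simply passes the request body to run as a string.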
After creating the entry script and environment for the inference configuration, what comes next?
Defining a deployment configuration
What can you use for testing within the real-time inferencing service?
For testing, you can use the Azure Machine Learning SDK to call the web service through the run method of a Webservice object that references the deployed service.
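Such a test call can be sketched as follows, assuming the v1 azureml-core SDK, an existing workspace, and a hypothetical service name and payload:

```python
import json

from azureml.core import Workspace
from azureml.core.webservice import Webservice

ws = Workspace.from_config()  # assumes a config.json for your workspace

# Retrieve a reference to the deployed service (hypothetical name).
service = Webservice(workspace=ws, name="my-inference-service")

# run sends the JSON payload to the service and returns its response.
input_json = json.dumps({"data": [[1.2, 3.4], [5.6, 7.8]]})
predictions = service.run(input_data=input_json)
print(predictions)
```

Calling run through the SDK exercises the same entry script as a production request, without needing the service's scoring URI or authentication keys.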