Path6.Mod2.a - Deploy and Consume Models - Batch Endpoints Flashcards

1
Q

When to use Batch Endpoints

A

For long-running tasks that process large amounts of data through batch operations.

2
Q

Batch Inferencing

A

Azure ML uses Batch Inferencing to asynchronously apply a predictive model to multiple cases (asynchronous batch scoring), then persists the results to a file, database, or other datastore connected to your ML Workspace.

3
Q

BSJ CC m_i

Batch Endpoints
- When invoked creates this kind of Job
- The Job requires this kind of Compute
- To score new data in parallel batches, the Compute needs this setting to be greater than 1

A
  • When the Endpoint is invoked, it submits a Batch Scoring Job for execution
  • Batch Scoring Jobs require a Compute Cluster for scoring multiple inputs
  • The Compute Cluster requires a max_instances value greater than 1
4
Q

Create an instance of the BatchEndpoint class

A

Straightforward and pretty simple:

from azure.ai.ml.entities import BatchEndpoint

endpoint = BatchEndpoint(
    name="endpoint-example",
    description="A batch endpoint",
)

ml_client.batch_endpoints.begin_create_or_update(endpoint)
5
Q

The Batch Endpoint name does NOT have to be unique per Region, but must be unique per Workspace (T/F)

A

False. The name must always be unique within an Azure Region.

6
Q

A BatchEndpoint instance must be created first, before Deployments are added to it (T/F)

A

True
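
Once the Endpoint exists, a Deployment can be added to it. A minimal sketch using the azure-ai-ml SDK — the model, compute, and deployment names here are hypothetical placeholders, not values from this deck:

```python
from azure.ai.ml.entities import BatchDeployment, BatchRetrySettings

# All names below (deployment, model, compute cluster) are hypothetical examples.
deployment = BatchDeployment(
    name="classifier-batch",
    endpoint_name="endpoint-example",
    model="azureml:my-model:1",      # assumed: an already-registered model
    compute="aml-cluster",           # assumed: an existing Compute Cluster
    instance_count=2,                # >1 so mini-batches can run in parallel
    max_concurrency_per_instance=2,
    mini_batch_size=10,
    output_file_name="predictions.csv",
    retry_settings=BatchRetrySettings(max_retries=3, timeout=300),
)

ml_client.batch_deployments.begin_create_or_update(deployment)
```

This only registers the Deployment against the Endpoint; the scoring itself happens later, when the Endpoint is invoked.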

7
Q

You can deploy multiple Models to a BatchEndpoint (T/F)

A

True

8
Q

The type of Deployment the Batch Scoring Job uses unless otherwise specified, and how that Deployment is triggered

A

It uses the default Deployment, triggered by invoking the Batch Endpoint.

Upon viewing the Scoring Job's details, the Deployment's details appear under Deployment Summary.
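
As a sketch of that trigger from the SDK — the endpoint name and data path are hypothetical assumptions:

```python
from azure.ai.ml import Input
from azure.ai.ml.constants import AssetTypes

# Invoking the Endpoint submits a Batch Scoring Job against the default
# Deployment; pass deployment_name= to target a specific Deployment instead.
job = ml_client.batch_endpoints.invoke(
    endpoint_name="endpoint-example",  # hypothetical endpoint name
    input=Input(
        type=AssetTypes.URI_FOLDER,
        path="azureml://datastores/workspaceblobstore/paths/new-data",  # hypothetical path
    ),
)
```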

9
Q

Create an instance of the AmlCompute class for a Compute Cluster that will support Batch Endpoints

A
from azure.ai.ml.entities import AmlCompute

cpu_cluster = AmlCompute(
    name="aml-cluster",
    type="amlcompute",
    size="STANDARD_DS11_v2",
    min_instances=0,
    max_instances=4,  # the key point: allow more than one instance
    idle_time_before_scale_down=120,
    tier="Dedicated",
)

cpu_cluster = ml_client.compute.begin_create_or_update(cpu_cluster)
10
Q

The advantage of using a Compute Cluster for running the Scoring Script

A

You can run the Scoring Script on separate instances in parallel.
