Question 1 - DP-100 discussion

You create a deep learning model for image recognition on Azure Machine Learning service using GPU-based training.

You must deploy the model to a context that allows for real-time GPU-based inferencing.

You need to configure compute resources for model inferencing.

Which compute type should you use?

A. Azure Container Instance
B. Azure Kubernetes Service
C. Field Programmable Gate Array
D. Machine Learning Compute
Suggested answer: B

Explanation:

You can use Azure Machine Learning to deploy a GPU-enabled model as a real-time web service. Deploying to Azure Kubernetes Service (AKS) is the supported option for GPU-based inferencing: the AKS cluster is provisioned with GPU-enabled nodes, and the deployed model uses those GPUs for scoring. Azure Container Instances does not support GPU-based inference deployments, and Machine Learning Compute is intended for training and batch workloads rather than real-time serving.

Inference, or model scoring, is the phase in which the deployed model makes predictions on new data. Using GPUs instead of CPUs offers performance advantages for highly parallelizable computation, such as the matrix operations in deep learning image-recognition models.
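The steps described above can be sketched with the Azure ML SDK v1 (the `azureml-core` package, which the referenced documentation uses). This is a minimal illustration, not a complete deployment: `model`, `env`, and `score.py` are placeholders you would supply, and the resource names and VM size are assumptions.

```python
from azureml.core import Workspace, Model
from azureml.core.compute import AksCompute, ComputeTarget
from azureml.core.model import InferenceConfig
from azureml.core.webservice import AksWebservice

ws = Workspace.from_config()  # assumes a local config.json for the workspace

# Provision an AKS cluster with a GPU VM size (Standard_NC6 is one example).
prov_config = AksCompute.provisioning_configuration(vm_size="Standard_NC6")
aks_target = ComputeTarget.create(ws, "gpu-aks", prov_config)
aks_target.wait_for_completion(show_output=True)

# Request a GPU for the web service via gpu_cores in the deploy configuration.
deploy_config = AksWebservice.deploy_configuration(
    cpu_cores=1, memory_gb=4, gpu_cores=1
)

# entry_script and environment are placeholders for your scoring script
# and a GPU-enabled environment (e.g. one built on a CUDA base image).
inference_config = InferenceConfig(entry_script="score.py", environment=env)

service = Model.deploy(
    ws, "gpu-image-service", [model], inference_config, deploy_config, aks_target
)
service.wait_for_deployment(show_output=True)
```

The key detail for this question is that `gpu_cores` is honored only on AKS-backed web services; an equivalent ACI deployment configuration has no GPU option.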

Reference: https://docs.microsoft.com/en-us/azure/machine-learning/how-to-deploy-inferencing-gpus

asked 02/10/2024
PHINIT LAORUNGRUANGDECH