
Microsoft DP-100 Practice Test - Questions Answers, Page 3

Question 21


You develop and train a machine learning model to predict fraudulent transactions for a hotel booking website.

Traffic to the site varies considerably. The site experiences heavy traffic on Monday and Friday and much lower traffic on other days. Holidays are also high web traffic days.

You need to deploy the model as an Azure Machine Learning real-time web service endpoint on compute that can dynamically scale up and down to support demand.

Which deployment compute option should you use?

A. attached Azure Databricks cluster
B. Azure Container Instance (ACI)
C. Azure Kubernetes Service (AKS) inference cluster
D. Azure Machine Learning Compute Instance
E. attached virtual machine in a different region
Suggested answer: Azure Kubernetes Service (AKS) inference cluster (C)
Explanation:

Azure Kubernetes Service (AKS) inference clusters are the recommended target for high-scale, production real-time web service deployments in Azure Machine Learning. An AKS web service can autoscale, adding and removing replicas automatically as request volume changes, which suits a site whose traffic varies sharply by day of week and on holidays. Azure Container Instances (ACI) is intended for low-scale dev/test deployments and does not autoscale, and a compute instance or an attached virtual machine is a single fixed-size node.

Reference: https://docs.microsoft.com/en-us/azure/machine-learning/how-to-deploy-azure-kubernetes-service
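The dynamic scaling behavior is configured at deployment time. A minimal sketch with the v1 SDK, assuming `ws`, `model`, and `inference_config` already exist in the session and using illustrative resource names (this is a deployment-configuration sketch, not a definitive implementation):

```python
from azureml.core.compute import AksCompute
from azureml.core.model import Model
from azureml.core.webservice import AksWebservice

# Attach to an existing AKS inference cluster (cluster name is illustrative).
aks_target = AksCompute(ws, "fraud-aks")

# Autoscale between 1 and 10 replicas so capacity follows the
# weekday/holiday traffic swings described in the question.
deployment_config = AksWebservice.deploy_configuration(
    cpu_cores=1,
    memory_gb=1,
    autoscale_enabled=True,
    autoscale_min_replicas=1,
    autoscale_max_replicas=10,
    autoscale_target_utilization=70,
)

service = Model.deploy(ws, "fraud-detector", [model], inference_config,
                       deployment_config, aks_target)
service.wait_for_deployment(show_output=True)
```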

asked 02/10/2024
Samya Sharab
41 questions

Question 22


HOTSPOT

You are a lead data scientist for a project that tracks the health and migration of birds. You create a multi-image classification deep learning model that uses a set of labeled bird photos collected by experts. You plan to use the model to develop a cross-platform mobile app that predicts the species of bird captured by app users.

You must test and deploy the trained model as a web service. The deployed model must meet the following requirements:

An authenticated connection must not be required for testing.

The deployed model must perform with low latency during inferencing.

The REST endpoints must be scalable and should have a capacity to handle large number of requests when multiple end users are using the mobile application.

You need to verify that the web service returns predictions in the expected JSON format when a valid REST request is submitted.

Which compute resources should you use? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.


[Hotspot question and correct-answer images omitted; see the explanation below.]
Explanation:

Box 1: ds-workstation notebook VM

An authenticated connection must not be required for testing.

On a Microsoft Azure virtual machine (VM), including a Data Science Virtual Machine (DSVM), you create local user accounts while provisioning the VM. Users then authenticate to the VM by using these credentials.

Box 2: gpu-compute cluster

Image classification deep learning models are well suited to GPU compute clusters.

Reference:

https://docs.microsoft.com/en-us/azure/machine-learning/data-science-virtual-machine/dsvm-common-identity

https://docs.microsoft.com/en-us/azure/architecture/reference-architectures/ai/training-deep-learning


Question 23


HOTSPOT

You deploy a model in Azure Container Instance.

You must use the Azure Machine Learning SDK to call the model API.

You need to invoke the deployed model using native SDK classes and methods.

How should you complete the command? To answer, select the appropriate options in the answer areas.

NOTE: Each correct selection is worth one point.


[Hotspot question and correct-answer images omitted; see the explanation below.]
Explanation:

Box 1: from azureml.core.webservice import Webservice

The following code shows how to use the SDK to reference the classes needed to update the model, environment, and entry script for a web service deployed to Azure Container Instances:

from azureml.core import Environment
from azureml.core.webservice import Webservice
from azureml.core.model import Model, InferenceConfig

Box 2: predictions = service.run(input_json)

Example: the following code demonstrates sending data to the service:

import json

test_sample = json.dumps({'data': [
    [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
    [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]
]})
test_sample = bytes(test_sample, encoding='utf8')

prediction = service.run(input_data=test_sample)
print(prediction)
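The payload-construction half of this example can be exercised locally without a deployed service. The sketch below leaves out the Webservice object (it exists only in an Azure ML workspace) and simply confirms that the request body round-trips through JSON the way the entry script's run() function would see it:

```python
import json

# Build the request body in the shape the scoring script expects.
rows = [
    [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
    [10, 9, 8, 7, 6, 5, 4, 3, 2, 1],
]
payload = bytes(json.dumps({'data': rows}), encoding='utf8')

# Decode and parse, as run() would on the service side.
parsed = json.loads(payload.decode('utf8'))
print(len(parsed['data']))  # 2
```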

Reference:

https://docs.microsoft.com/bs-latn-ba/azure/machine-learning/how-to-deploy-azure-container-instance

https://docs.microsoft.com/en-us/azure/machine-learning/how-to-troubleshoot-deployment


Question 24


HOTSPOT

You use Azure Machine Learning to train and register a model.

You must deploy the model into production as a real-time web service to an inference cluster named service-compute that the IT department has created in the Azure Machine Learning workspace.

Client applications consuming the deployed web service must be authenticated based on their Azure Active Directory service principal.

You need to write a script that uses the Azure Machine Learning SDK to deploy the model. The necessary modules have been imported.

How should you complete the code? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.


[Hotspot question and correct-answer images omitted; see the explanation below.]
Explanation:

Box 1: AksCompute

Example:

aks_target = AksCompute(ws, "myaks")
# If deploying to a cluster configured for dev/test, ensure that it was created with enough
# cores and memory to handle this deployment configuration. Note that memory is also used by
# things such as dependencies and AML components.
deployment_config = AksWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)
service = Model.deploy(ws, "myservice", [model], inference_config, deployment_config, aks_target)

Box 2: AksWebservice

Box 3: token_auth_enabled=True

token_auth_enabled is a boolean parameter of AksWebservice.deploy_configuration that specifies whether Azure Active Directory token authentication is enabled for the web service, so it is set to True (not Yes).

Note: A Service principal defined in Azure Active Directory (Azure AD) can act as a principal on which authentication and authorization policies can be enforced in Azure Databricks.

The Azure Active Directory Authentication Library (ADAL) can be used to programmatically get an Azure AD access token for a user.

Incorrect Answers:

auth_enabled (bool): Whether or not to enable key auth for this Webservice. Defaults to True.
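Taken together, the three boxes could be combined into a deployment-configuration sketch like the following (hedged: `ws`, `model`, and `inference_config` are assumed to already exist in the session, and the service name is illustrative; service-compute is the inference cluster named in the question):

```python
from azureml.core.compute import AksCompute
from azureml.core.model import Model
from azureml.core.webservice import AksWebservice

# Attach to the existing inference cluster created by the IT department.
aks_target = AksCompute(ws, "service-compute")

# Enable Azure AD token authentication; key-based auth is turned off because
# the two authentication modes cannot both be enabled on an AKS web service.
deployment_config = AksWebservice.deploy_configuration(
    cpu_cores=1,
    memory_gb=1,
    token_auth_enabled=True,
    auth_enabled=False,
)

service = Model.deploy(ws, "ml-service", [model], inference_config,
                       deployment_config, aks_target)
service.wait_for_deployment(show_output=True)
```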

Reference:

https://docs.microsoft.com/en-us/azure/machine-learning/how-to-deploy-azure-kubernetes-service

https://docs.microsoft.com/en-us/azure/databricks/dev-tools/api/latest/aad/service-prin-aad-token


Question 25


DRAG DROP

You use Azure Machine Learning to deploy a model as a real-time web service.

You need to create an entry script for the service that ensures that the model is loaded when the service starts and is used to score new data as it is received.

Which functions should you include in the script? To answer, drag the appropriate functions to the correct actions. Each function may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.


[Drag-and-drop question and correct-answer images omitted; see the explanation below.]
Explanation:

Box 1: init()

The entry script has only two required functions, init() and run(data). These functions are used to initialize the service at startup and run the model using request data passed in by a client. The rest of the script handles loading and running the model(s).

Box 2: run()
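A minimal sketch of such an entry script, with a per-row sum standing in for the real model (a production script would load the registered model in init(), typically via Model.get_model_path() and a framework-specific loader):

```python
import json

def init():
    # Runs once when the service starts: load the model into a global.
    # Here a simple per-row sum stands in for the trained model.
    global model
    model = lambda rows: [sum(row) for row in rows]

def run(raw_data):
    # Runs for every scoring request: parse JSON, score, return results.
    data = json.loads(raw_data)['data']
    return json.dumps({'result': model(data)})

# Local smoke test of the init/run contract:
init()
response = run(json.dumps({'data': [[1, 2, 3], [4, 5, 6]]}))
print(response)  # {"result": [6, 15]}
```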

Reference:

https://docs.microsoft.com/en-us/azure/machine-learning/how-to-deploy-existing-model


Question 26


You use the designer to create a training pipeline for a classification model. The pipeline uses a dataset that includes the features and labels required for model training.

You create a real-time inference pipeline from the training pipeline. You observe that the schema for the generated web service input is based on the dataset and includes the label column that the model predicts. Client applications that use the service must not be required to submit this value.

You need to modify the inference pipeline to meet the requirement.

What should you do?

A. Add a Select Columns in Dataset module to the inference pipeline after the dataset and use it to select all columns other than the label.
B. Delete the dataset from the training pipeline and recreate the real-time inference pipeline.
C. Delete the Web Service Input module from the inference pipeline.
D. Replace the dataset in the inference pipeline with an Enter Data Manually module that includes data for the feature columns but not the label column.
Suggested answer: A
Explanation:

By default, the Web Service Input module expects the same data schema as the module output that connects to the same downstream port. You can remove the label (target) column in the inference pipeline by using a Select Columns in Dataset module; make sure its output, with the label column removed, is connected to the same port as the output of the Web Service Input module.

Reference:

https://docs.microsoft.com/en-us/azure/machine-learning/tutorial-designer-automobile-price-deploy


Question 27


You use the Azure Machine Learning designer to create and run a training pipeline. You then create a real-time inference pipeline.

You must deploy the real-time inference pipeline as a web service.

What must you do before you deploy the real-time inference pipeline?

A. Run the real-time inference pipeline.
B. Create a batch inference pipeline.
C. Clone the training pipeline.
D. Create an Azure Machine Learning compute cluster.
Suggested answer: D
Explanation:

You need to create an inferencing cluster.

Deploy the real-time endpoint

After your AKS service has finished provisioning, return to the real-time inferencing pipeline to complete deployment.

1. Select Deploy above the canvas.

2. Select Deploy new real-time endpoint.

3. Select the AKS cluster you created.

4. Select Deploy.

Reference:

https://docs.microsoft.com/en-us/azure/machine-learning/tutorial-designer-automobile-price-deploy


Question 28


You create an Azure Machine Learning workspace named ML-workspace. You also create an Azure Databricks workspace named DB-workspace. DB-workspace contains a cluster named DB-cluster.

You must use DB-cluster to run experiments from notebooks that you import into DB-workspace.

You need to use ML-workspace to track MLflow metrics and artifacts generated by experiments running on DB-cluster. The solution must minimize the need for custom code.

What should you do?

A. From DB-cluster, configure the Advanced Logging option.
B. From DB-workspace, configure the Link Azure ML workspace option.
C. From ML-workspace, create an attached compute.
D. From ML-workspace, create a compute cluster.
Suggested answer: B
Explanation:

Connect your Azure Databricks and Azure Machine Learning workspaces:

Linking your ADB workspace to your Azure Machine Learning workspace enables you to track your experiment data in the Azure Machine Learning workspace.

To link your ADB workspace to a new or existing Azure Machine Learning workspace

1. Sign in to Azure portal.

2. Navigate to your ADB workspace's Overview page.

3. Select the Link Azure Machine Learning workspace button on the bottom right.

[Screenshot of the Link Azure Machine Learning workspace button omitted.]

Reference:

https://docs.microsoft.com/en-us/azure/machine-learning/how-to-use-mlflow-azure-databricks


Question 29


HOTSPOT

You create an Azure Machine Learning workspace.

You need to detect data drift between a baseline dataset and a subsequent target dataset by using the DataDriftDetector class.

How should you complete the code segment? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.


[Hotspot question and correct-answer images omitted; see the explanation below.]
Explanation:

Box 1: create_from_datasets

The create_from_datasets method creates a new DataDriftDetector object from a baseline tabular dataset and a target time series dataset.

Box 2: backfill

The backfill method runs a backfill job over a given specified start and end date.

Syntax: backfill(start_date, end_date, compute_target=None, create_compute_target=False)

Incorrect Answers:

The list and update methods do not take start and end datetime parameters.

Reference:

https://docs.microsoft.com/en-us/python/api/azureml-datadrift/azureml.datadrift.datadriftdetector(class)


Question 30


You are planning to register a trained model in an Azure Machine Learning workspace.

You must store additional metadata about the model in a key-value format. You must be able to add new metadata and modify or delete metadata after creation.

You need to register the model.

Which parameter should you use?

A. description
B. model_framework
C. tags
D. properties
Suggested answer: tags (C)
Explanation:

azureml.core.Model.tags:

A dictionary of key/value tags for the model. Tags remain fully mutable after registration: new tags can be added with add_tags(), existing values changed with update(), and tags deleted with remove_tags(). By contrast, properties cannot be changed or removed after registration (only new key/value pairs can be added), so properties does not satisfy the requirement to modify or delete metadata after creation.

Reference:

https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.model.model
