
Microsoft DP-300 Practice Test - Questions Answers, Page 10


Question 91


Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have an Azure Data Lake Storage account that contains a staging zone.

You need to design a daily process to ingest incremental data from the staging zone, transform the data by executing an R script, and then insert the transformed data into a data warehouse in Azure Synapse Analytics.

Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that executes a mapping data flow, and then inserts the data into the data warehouse.

Does this meet the goal?

A. Yes
B. No
Suggested answer: B

Explanation:

If you need to transform data in a way that is not supported by Data Factory, you can create a custom activity (not a mapping data flow) with your own data processing logic and use the activity in the pipeline. You can create a custom activity to run R scripts on your HDInsight cluster with R installed.
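
For context, a minimal sketch of how such a custom activity might be defined with the Python management SDK (azure-mgmt-datafactory) follows. Custom activities execute on an Azure Batch pool, so this sketch assumes a Batch pool with R installed (an alternative to the HDInsight approach mentioned above); the linked service names, folder path, and script name are placeholders, not part of the original question.

```python
# Hypothetical sketch: an ADF pipeline whose custom activity runs an R script on an
# Azure Batch pool that has R installed. All names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    CustomActivity,
    LinkedServiceReference,
    PipelineResource,
)

credential = DefaultAzureCredential()
adf = DataFactoryManagementClient(credential, "<subscription-id>")

run_r_script = CustomActivity(
    name="TransformWithR",
    # The command runs on the Batch pool nodes; transform.R is assumed to be staged
    # in the "scripts" folder of the storage account referenced below.
    command="Rscript transform.R",
    linked_service_name=LinkedServiceReference(
        type="LinkedServiceReference", reference_name="AzureBatchLinkedService"
    ),
    resource_linked_service=LinkedServiceReference(
        type="LinkedServiceReference", reference_name="StagingStorageLinkedService"
    ),
    folder_path="scripts",
)

pipeline = PipelineResource(activities=[run_r_script])
adf.pipelines.create_or_update(
    "<resource-group>", "<data-factory-name>", "IngestTransformLoad", pipeline
)
```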

Reference:

https://docs.microsoft.com/en-US/azure/data-factory/transform-data


Question 92


Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have an Azure Data Lake Storage account that contains a staging zone.

You need to design a daily process to ingest incremental data from the staging zone, transform the data by executing an R script, and then insert the transformed data into a data warehouse in Azure Synapse Analytics.

Solution: You schedule an Azure Databricks job that executes an R notebook, and then inserts the data into the data warehouse.

Does this meet the goal?

A. Yes
B. No
Suggested answer: B

Explanation:

You must use an Azure Data Factory pipeline, not an Azure Databricks job.

Reference:

https://docs.microsoft.com/en-US/azure/data-factory/transform-data


Question 93


Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have an Azure Data Lake Storage account that contains a staging zone.

You need to design a daily process to ingest incremental data from the staging zone, transform the data by executing an R script, and then insert the transformed data into a data warehouse in Azure Synapse Analytics.

Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that executes an Azure Databricks notebook, and then inserts the data into the data warehouse.

Does this meet the goal?

A. Yes
B. No
Suggested answer: B

Explanation:

If you need to transform data in a way that is not supported by Data Factory, you can create a custom activity, not an Azure Databricks notebook, with your own data processing logic and use the activity in the pipeline. You can create a custom activity to run R scripts on your HDInsight cluster with R installed.
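
Every solution in this series starts from an Azure Data Factory schedule trigger that runs the pipeline once per day. For reference, a minimal sketch of such a daily trigger, again using the azure-mgmt-datafactory SDK; the pipeline name, trigger name, and start time are assumptions.

```python
# Hypothetical sketch: a schedule trigger that runs the pipeline once per day.
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineReference,
    ScheduleTrigger,
    ScheduleTriggerRecurrence,
    TriggerPipelineReference,
    TriggerResource,
)

credential = DefaultAzureCredential()
adf = DataFactoryManagementClient(credential, "<subscription-id>")

daily_trigger = ScheduleTrigger(
    recurrence=ScheduleTriggerRecurrence(
        frequency="Day",
        interval=1,
        start_time=datetime(2024, 10, 2, tzinfo=timezone.utc),
        time_zone="UTC",
    ),
    pipelines=[
        TriggerPipelineReference(
            pipeline_reference=PipelineReference(
                type="PipelineReference", reference_name="IngestTransformLoad"
            )
        )
    ],
)

adf.triggers.create_or_update(
    "<resource-group>",
    "<data-factory-name>",
    "DailyIngest",
    TriggerResource(properties=daily_trigger),
)
# The trigger must still be started before it fires on schedule.
```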

Reference:

https://docs.microsoft.com/en-US/azure/data-factory/transform-data


Question 94


Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have an Azure Data Lake Storage account that contains a staging zone.

You need to design a daily process to ingest incremental data from the staging zone, transform the data by executing an R script, and then insert the transformed data into a data warehouse in Azure Synapse Analytics.

Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that copies the data to a staging table in the data warehouse, and then uses a stored procedure to execute the R script.

Does this meet the goal?

A. Yes
B. No
Suggested answer: A

Explanation:

If you need to transform data in a way that is not supported by Data Factory, you can create a custom activity with your own data processing logic and use the activity in the pipeline. You can create a custom activity to run R scripts on your HDInsight cluster with R installed.
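
The accepted solution here stages the data with a copy activity and then calls a stored procedure in the data warehouse to run the R-based transformation. A rough sketch of that pipeline shape with the azure-mgmt-datafactory SDK follows; it is only an illustration under assumptions, and the dataset, linked service, and stored procedure names are hypothetical.

```python
# Hypothetical sketch: copy to a staging table, then execute a stored procedure
# that runs the R-based transformation. All names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    ActivityDependency,
    CopyActivity,
    DatasetReference,
    DelimitedTextSource,
    LinkedServiceReference,
    PipelineResource,
    SqlDWSink,
    SqlServerStoredProcedureActivity,
)

credential = DefaultAzureCredential()
adf = DataFactoryManagementClient(credential, "<subscription-id>")

copy_to_staging = CopyActivity(
    name="CopyToStagingTable",
    inputs=[DatasetReference(type="DatasetReference", reference_name="StagingZoneFiles")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="DwStagingTable")],
    source=DelimitedTextSource(),
    sink=SqlDWSink(),
)

run_r_proc = SqlServerStoredProcedureActivity(
    name="RunRTransform",
    stored_procedure_name="dbo.usp_TransformStagedData",
    linked_service_name=LinkedServiceReference(
        type="LinkedServiceReference", reference_name="SynapseDedicatedPool"
    ),
    # Run the stored procedure only after the copy succeeds.
    depends_on=[
        ActivityDependency(activity="CopyToStagingTable", dependency_conditions=["Succeeded"])
    ],
)

pipeline = PipelineResource(activities=[copy_to_staging, run_r_proc])
adf.pipelines.create_or_update(
    "<resource-group>", "<data-factory-name>", "DailyIngestAndTransform", pipeline
)
```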

Reference:

https://docs.microsoft.com/en-US/azure/data-factory/transform-data


Question 95


HOTSPOT

You have an Azure Data Factory instance named ADF1 and two Azure Synapse Analytics workspaces named WS1 and WS2.

ADF1 contains the following pipelines:

P1: Uses a copy activity to copy data from a nonpartitioned table in a dedicated SQL pool of WS1 to an Azure Data Lake Storage Gen2 account.

P2: Uses a copy activity to copy data from text-delimited files in an Azure Data Lake Storage Gen2 account to a nonpartitioned table in a dedicated SQL pool of WS2.

You need to configure P1 and P2 to maximize parallelism and performance.

Which dataset settings should you configure for the copy activity of each pipeline? To answer, select the appropriate options in the answer area.


(The answer area and the correct answer are provided as images; the key settings are restated in the explanation.)

Explanation:

P1: Set the Partition option to Dynamic Range.

The SQL Server connector in copy activity provides built-in data partitioning to copy data in parallel.

P2: Set the Copy method to PolyBase.

PolyBase is the most efficient way to load data into Azure Synapse Analytics. Use the staged copy feature to achieve high load speeds from all types of data stores, including Azure Blob storage and Data Lake Store. (PolyBase supports Azure Blob storage and Azure Data Lake Store by default.)
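
Expressed through the copy activity source and sink types in the Python SDK, the two settings above would look roughly like the sketch below, assuming a recent azure-mgmt-datafactory version; the partition column name is a placeholder.

```python
# Hypothetical sketch of the two settings highlighted above (azure-mgmt-datafactory models).
from azure.mgmt.datafactory.models import SqlDWSink, SqlDWSource, SqlPartitionSettings

# P1 source: dynamic-range partitioning lets the copy activity read the
# nonpartitioned table in WS1 with several parallel queries.
p1_source = SqlDWSource(
    partition_option="DynamicRange",
    partition_settings=SqlPartitionSettings(partition_column_name="<high-cardinality-column>"),
)

# P2 sink: load the text-delimited files into the WS2 dedicated SQL pool via PolyBase.
p2_sink = SqlDWSink(allow_poly_base=True)
```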

Reference:

https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-sql-data-warehouse

https://docs.microsoft.com/en-us/azure/data-factory/load-azure-sql-data-warehouse


Question 96

(The question and the correct answer are provided as images.)

Explanation:

Box 1: No

Just one failure within the 5-minute interval.

Box 2: No

Just two failures within the 5-minute interval.

Box 3: No

Just two failures within the 5-minute interval.

Reference:

https://docs.microsoft.com/en-us/azure/azure-monitor/alerts/alerts-metric-overview


Question 97


DRAG DROP

You have an Azure subscription that contains an Azure SQL managed instance named SQLMi1 and a SQL Agent job named Backupdb. Backupdb performs a daily backup of the databases hosted on SQLMi1.

You need to be notified by email if the job fails.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

NOTE: More than one order of answer choices is correct. You will receive credit for any of the correct orders you select.


(The list of actions and the correct sequence are provided as images.)

Explanation:

Reference:

https://docs.microsoft.com/en-us/azure/azure-sql/managed-instance/job-automation-managed-instance
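
The correct sequence is shown in the answer image. For context, a commonly used approach on a managed instance is to configure Database Mail, create an operator, and then update the job to notify that operator on failure. A rough sketch using pyodbc follows; the connection string, operator name, and e-mail address are placeholders, and a Database Mail profile is assumed to already exist on SQLMi1.

```python
# Hypothetical sketch: create an operator and have the Backupdb job e-mail it on failure.
import pyodbc

# Placeholder connection string to the managed instance.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sqlmi1.<dns-zone>.database.windows.net;DATABASE=msdb;"
    "UID=<admin-login>;PWD=<password>",
    autocommit=True,
)
cursor = conn.cursor()

# 1) Operator that receives the failure notification.
cursor.execute(
    "EXEC msdb.dbo.sp_add_operator @name = N'DBA Team', @email_address = N'dba@contoso.com';"
)

# 2) Notify the operator by e-mail when the Backupdb job fails (@notify_level_email = 2).
cursor.execute(
    "EXEC msdb.dbo.sp_update_job "
    "@job_name = N'Backupdb', "
    "@notify_level_email = 2, "
    "@notify_email_operator_name = N'DBA Team';"
)

conn.close()
```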


Question 98


DRAG DROP

You have SQL Server on an Azure virtual machine.

You need to use Policy-Based Management in Microsoft SQL Server to identify stored procedures that do not comply with your naming conventions.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.


(The list of actions and the correct sequence are provided as images.)

Explanation:

Reference:

https://www.mssqltips.com/sqlservertip/2298/enforce-sql-server-database-naming-conventions-using-policy-based-management/


Question 99


You have an Azure SQL managed instance named SQLMI1 that hosts 10 databases.

You need to implement alerts by using Azure Monitor. The solution must meet the following requirements:

Minimize costs.

Aggregate Intelligent Insights telemetry from each database.

What should you do?

A. From the Diagnostic settings of each database, select Send to Log Analytics.
B. From the Diagnostic settings of each database, select Stream to an event hub.
C. From the Diagnostic settings of SQLMI1, select Send to Log Analytics.
D. From the Diagnostic settings of SQLMI1, select Stream to an event hub.
Suggested answer: A

Explanation:

Reference:

https://docs.microsoft.com/en-us/azure/azure-sql/database/metrics-diagnostic-telemetry-logging-streaming-export-configure?tabs=azure-portal#configure-the-streaming-export-of-diagnostic-telemetry
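
For reference, enabling the Intelligent Insights (SQLInsights) diagnostic log on one database and sending it to a Log Analytics workspace might look like the sketch below, assuming the azure-mgmt-monitor SDK; the resource IDs are placeholders, and keyword arguments are used because the positional order of this call differs between SDK versions. The same call would be repeated for each of the 10 databases.

```python
# Hypothetical sketch: send a database's SQLInsights (Intelligent Insights) log to Log Analytics.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import DiagnosticSettingsResource, LogSettings

credential = DefaultAzureCredential()
monitor = MonitorManagementClient(credential, "<subscription-id>")

# Diagnostic settings are configured per database on the managed instance;
# repeat for each database hosted on SQLMI1.
database_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.Sql/managedInstances/SQLMI1/databases/<database-name>"
)
workspace_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.OperationalInsights/workspaces/<log-analytics-workspace>"
)

settings = DiagnosticSettingsResource(
    workspace_id=workspace_id,
    logs=[LogSettings(category="SQLInsights", enabled=True)],
)

monitor.diagnostic_settings.create_or_update(
    resource_uri=database_id,
    name="SendToLogAnalytics",
    parameters=settings,
)
```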


Question 100


You have an Azure SQL managed instance that hosts multiple databases.

You need to configure alerts for each database based on the diagnostics telemetry of the database.

What should you use?

A. Azure SQL Analytics alerts based on metrics
B. SQL Health Check alerts based on diagnostics logs
C. SQL Health Check alerts based on metrics
D. Azure SQL Analytics alerts based on diagnostics logs
Suggested answer: D

Explanation:

Reference:

https://docs.microsoft.com/en-us/azure/azure-sql/database/metrics-diagnostic-telemetry-logging-streaming-export-configure?tabs=azure-portal#configure-the-streaming-export-of-diagnostic-telemetry
