
Microsoft DP-700 Practice Test - Questions & Answers


Question 1


You need to ensure that the data analysts can access the gold layer lakehouse.

What should you do?

A. Add the DataAnalyst group to the Viewer role for WorkspaceA.

B. Share the lakehouse with the DataAnalysts group and grant the Build reports on the default semantic model permission.

C. Share the lakehouse with the DataAnalysts group and grant the Read all SQL Endpoint data permission.

D. Share the lakehouse with the DataAnalysts group and grant the Read all Apache Spark permission.

Suggested answer: C
Explanation:

The data analysts' access requirements state that they must have read access to the Delta tables in the gold layer only, and must not have access to the bronze and silver layers.

The gold layer data is typically queried via SQL Endpoints. Granting the Read all SQL Endpoint data permission allows data analysts to query the data using familiar SQL-based tools while restricting access to the underlying files.
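
For context, a minimal sketch of the kind of T-SQL query this permission enables through the lakehouse's SQL analytics endpoint (the table and column names are hypothetical):

    -- Hypothetical gold-layer table queried through the SQL analytics endpoint.
    -- Analysts can read the data with plain T-SQL but cannot access the files.
    SELECT TOP 10
        ProductKey,
        SUM(SalesAmount) AS TotalSales
    FROM gold.FactSales
    GROUP BY ProductKey
    ORDER BY TotalSales DESC;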


Question 2


HOTSPOT

You need to recommend a method to populate the POS1 data to the lakehouse medallion layers.

What should you recommend for each layer? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.


[The answer area and correct selections for this question are provided as images that are not reproduced here.]

Question 3


You need to ensure that usage of the data in the Amazon S3 bucket meets the technical requirements.

What should you do?

A. Create a workspace identity and enable high concurrency for the notebooks.

B. Create a shortcut and ensure that caching is disabled for the workspace.

C. Create a workspace identity and use the identity in a data pipeline.

D. Create a shortcut and ensure that caching is enabled for the workspace.

Suggested answer: B
Explanation:

To ensure that the usage of the data in the Amazon S3 bucket meets the technical requirements, we must address two key points:

Minimize egress costs associated with cross-cloud data access: Using a shortcut ensures that Fabric does not replicate the data from the S3 bucket into the lakehouse but rather provides direct access to the data in its original location. This minimizes cross-cloud data transfer and avoids additional egress costs.

Prevent saving a copy of the raw data in the lakehouses: Disabling caching ensures that the raw data is not copied or persisted in the Fabric workspace. The data is accessed on demand directly from the Amazon S3 bucket.
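
As a hedged illustration: once the shortcut exists under the lakehouse's Files area, the data can be queried in place with Spark SQL, without copying it into the lakehouse (the shortcut path and Parquet format are assumptions, and the relative path presumes a default lakehouse is attached to the notebook):

    -- Hypothetical shortcut path; the files stay in S3 and are read on demand.
    SELECT *
    FROM parquet.`Files/s3_sales_shortcut/raw`
    LIMIT 10;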


Question 4


HOTSPOT

You need to create the product dimension.

How should you complete the Apache Spark SQL code? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.


[The code-completion answer area and correct selections for this question are provided as images that are not reproduced here.]
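
Because the code in the image is not reproduced, here is a generic, hypothetical sketch of building a product dimension as a Delta table in Spark SQL (the schema, table, and column names are assumptions):

    -- Build a product dimension from a deduplicated silver-layer source.
    CREATE TABLE gold.dim_product
    USING DELTA
    AS
    SELECT DISTINCT
        ProductID   AS product_key,
        ProductName AS product_name,
        Category    AS category
    FROM silver.products;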

Question 5


You need to populate the MAR1 data in the bronze layer.

Which two types of activities should you include in the pipeline? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

A. ForEach

B. Copy data

C. WebHook

D. Stored procedure

Suggested answer: A, B
Explanation:

MAR1 has seven entities, each accessible via a different API endpoint. A ForEach activity is required to iterate over these endpoints to fetch data from each one. It enables dynamic execution of API calls for each entity.

The Copy data activity is the primary mechanism to extract data from REST APIs and load it into the bronze layer in Delta format. It supports native connectors for REST APIs and Delta, minimizing development effort.


Question 6


You need to schedule the population of the medallion layers to meet the technical requirements.

What should you do?

A. Schedule a data pipeline that calls other data pipelines.

B. Schedule a notebook.

C. Schedule an Apache Spark job.

D. Schedule multiple data pipelines.

Suggested answer: A
Explanation:

The technical requirements specify that:

Medallion layers must be fully populated sequentially (bronze → silver → gold). Each layer must be populated before the next.

If any step fails, the process must notify the data engineers.

Data imports should run simultaneously when possible.

Why Use a Data Pipeline That Calls Other Data Pipelines?

A data pipeline provides a modular and reusable approach to orchestrating the sequential population of medallion layers.

By calling other pipelines, each pipeline can focus on populating a specific layer (bronze, silver, or gold), simplifying development and maintenance.

A parent pipeline can handle:

- Sequential execution of child pipelines.

- Error handling to send email notifications upon failures.

- Parallel execution of tasks where possible (e.g., simultaneous imports into the bronze layer).


Question 7


You need to implement the solution for the book reviews.

What should you do?

A. Create a Dataflow Gen2 dataflow.

B. Create a shortcut.

C. Enable external data sharing.

D. Create a data pipeline.

Suggested answer: B
Explanation:

The requirement specifies that Litware plans to make the book reviews available in the lakehouse without making a copy of the data. In this case, creating a shortcut in Fabric is the most appropriate solution. A shortcut is a reference to the external data, and it allows Litware to access the book reviews stored in Amazon S3 without duplicating the data into the lakehouse.


Question 8


You need to resolve the sales data issue. The solution must minimize the amount of data transferred.

What should you do?

A. Split the dataflow into two dataflows.

B. Configure scheduled refresh for the dataflow.

C. Configure incremental refresh for the dataflow. Set Store rows from the past to 1 Month.

D. Configure incremental refresh for the dataflow. Set Refresh rows from the past to 1 Year.

E. Configure incremental refresh for the dataflow. Set Refresh rows from the past to 1 Month.

Suggested answer: E
Explanation:

The sales data issue can be resolved by configuring incremental refresh for the dataflow. Incremental refresh allows for only the new or changed data to be processed, minimizing the amount of data transferred and improving performance.

The scenario specifies that data older than one month never changes, so setting Refresh rows from the past to 1 Month is appropriate. This ensures that only the most recent month of data is refreshed, minimizing unnecessary data transfer.


Question 9


HOTSPOT

You need to troubleshoot the ad-hoc query issue.

How should you complete the statement? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.


[The statement to complete and the correct selections for this question are provided as images that are not reproduced here.]
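
As a hedged sketch, troubleshooting ad-hoc query performance on a Fabric SQL analytics endpoint typically starts from the built-in queryinsights views; for example, finding the slowest recent queries (the column selection is illustrative, not the exam's exact statement):

    -- List the slowest recently executed queries; see the queryinsights
    -- schema for the full set of available columns.
    SELECT TOP 10
        distributed_statement_id,
        command,
        total_elapsed_time_ms,
        start_time
    FROM queryinsights.exec_requests_history
    ORDER BY total_elapsed_time_ms DESC;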

Question 10


DRAG DROP

You need to ensure that the authors can see only their respective sales data.

How should you complete the statement? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.


[The statement to complete and the correct selections for this question are provided as images that are not reproduced here.]
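
Row-level security in T-SQL is the standard pattern for letting each author see only their own rows. A minimal sketch, assuming a hypothetical dbo.Sales table with an AuthorEmail column that matches each author's login:

    -- Predicate function: a row qualifies only when its AuthorEmail value
    -- matches the caller's identity (hypothetical schema and columns).
    CREATE FUNCTION dbo.fn_author_filter(@AuthorEmail AS VARCHAR(128))
    RETURNS TABLE
    WITH SCHEMABINDING
    AS
    RETURN
        SELECT 1 AS fn_result
        WHERE @AuthorEmail = USER_NAME();
    GO

    -- Bind the predicate so the filter is applied automatically to every query.
    CREATE SECURITY POLICY dbo.AuthorSalesFilter
    ADD FILTER PREDICATE dbo.fn_author_filter(AuthorEmail)
    ON dbo.Sales
    WITH (STATE = ON);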