Microsoft DP-600 Practice Test - Questions Answers

HOTSPOT

You need to assign permissions for the data store in the AnalyticsPOC workspace. The solution must meet the security requirements.

Which additional permissions should you assign when you share the data store? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.


Question 1
Correct answer: (answer provided as an image)

HOTSPOT

You need to create a DAX measure to calculate the average overall satisfaction score.

How should you complete the DAX code? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Question 2
Correct answer: (answer provided as an image)
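The answer-area options are not reproduced in this export, but a measure of this general shape would compute the average. The `Survey` table and `OverallSatisfaction` column below are placeholder names, not names taken from the case study:

```dax
-- Placeholder table/column names; the real model's names are not shown here.
Average Overall Satisfaction =
AVERAGE ( Survey[OverallSatisfaction] )
```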

DRAG DROP

You have a Fabric tenant that contains a semantic model. The model contains data about retail stores.

You need to write a DAX query that will be executed by using the XMLA endpoint. The query must return a table of stores that have opened since December 1, 2023.

How should you complete the DAX expression? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.


Question 3
Correct answer: (answer provided as an image)
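The drag-and-drop values are not shown here, but the typical pattern for this kind of question is the one below: `EVALUATE` returns a table over the XMLA endpoint, and `FILTER` with `DATE` restricts it to stores opened on or after December 1, 2023. The `Store` table and `OpenDate` column are placeholder names:

```dax
// Placeholder names; the case study's actual table and column are not shown.
EVALUATE
FILTER (
    Store,
    Store[OpenDate] >= DATE ( 2023, 12, 1 )
)
```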

HOTSPOT

You need to resolve the issue with the pricing group classification.

How should you complete the T-SQL statement? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Question 4
Correct answer: (answer provided as an image)
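The answer-area fragments are not reproduced here, but classification into pricing groups in T-SQL is typically done with a `CASE` expression. Everything below — table, columns, and price thresholds — is an illustrative assumption, not the case study's schema:

```sql
-- Illustrative sketch only; names and thresholds are assumptions.
SELECT  ProductID,
        ListPrice,
        CASE
            WHEN ListPrice < 50   THEN 'low'
            WHEN ListPrice < 1000 THEN 'medium'
            ELSE 'high'
        END AS PricingGroup
FROM    dbo.Product;
```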

What should you recommend using to ingest the customer data into the data store in the AnalyticsPOC workspace?

A. a stored procedure
B. a pipeline that contains a KQL activity
C. a Spark notebook
D. a dataflow
Suggested answer: D

Explanation:

For ingesting customer data into the data store in the AnalyticsPOC workspace, a dataflow (D) should be recommended. Dataflows are designed within the Power BI service to ingest, cleanse, transform, and load data into the Power BI environment. They allow for the low-code ingestion and transformation of data, as required by Litware's technical requirements. Reference: Microsoft's Power BI documentation on dataflows.

Which type of data store should you recommend in the AnalyticsPOC workspace?

A. a data lake
B. a warehouse
C. a lakehouse
D. an external Hive metastore
Suggested answer: C

Explanation:

A lakehouse (C) should be recommended for the AnalyticsPOC workspace. It combines the capabilities of a data warehouse with the flexibility of a data lake. A lakehouse supports semi-structured and unstructured data and allows for T-SQL and Python read access, fulfilling the technical requirements outlined for Litware. Reference: Microsoft's documentation on the lakehouse architecture in Fabric.
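As a minimal illustration of the T-SQL read access mentioned above: a Fabric lakehouse exposes a read-only SQL analytics endpoint, so tables loaded into the lakehouse can be queried with ordinary T-SQL. The table name below is a placeholder:

```sql
-- Queried through the lakehouse's SQL analytics endpoint (read-only).
SELECT TOP (10) *
FROM   dbo.Customer;   -- placeholder table name
```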

You need to recommend a solution to prepare the tenant for the PoC.

Which two actions should you recommend performing from the Fabric Admin portal? Each correct answer presents part of the solution.

NOTE: Each correct answer is worth one point.

A. Enable the Users can try Microsoft Fabric paid features option for specific security groups.
B. Enable the Allow Azure Active Directory guest users to access Microsoft Fabric option for specific security groups.
C. Enable the Users can create Fabric items option and exclude specific security groups.
D. Enable the Users can try Microsoft Fabric paid features option for the entire organization.
E. Enable the Users can create Fabric items option for specific security groups.
Suggested answer: A, E

Explanation:

The PoC is planned to be completed using a Fabric trial capacity, which implies that users involved in the PoC should be able to try paid features. However, this should be limited to specific security groups involved in the PoC to prevent the entire organization from accessing these features before the trial is proven successful (A). The ability for users to create Fabric items should also be enabled for specific security groups to ensure that only the relevant team members participating in the PoC can create items in the Fabric environment (E).

You need to ensure the data loading activities in the AnalyticsPOC workspace are executed in the appropriate sequence. The solution must meet the technical requirements.

What should you do?

A. Create a pipeline that has dependencies between activities and schedule the pipeline.
B. Create and schedule a Spark job definition.
C. Create a dataflow that has multiple steps and schedule the dataflow.
D. Create and schedule a Spark notebook.
Suggested answer: A

Explanation:

To meet the technical requirement that data loading activities must ensure the raw and cleansed data is updated completely before populating the dimensional model, you would need a mechanism that allows for ordered execution. A pipeline in Microsoft Fabric with dependencies set between activities can ensure that activities are executed in a specific sequence. Once set up, the pipeline can be scheduled to run at the required intervals (hourly or daily depending on the data source).
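As a sketch of what such a pipeline definition looks like: each downstream activity lists the upstream one in `dependsOn` with a `Succeeded` condition, so the dimensional model loads only after the raw and cleansed loads complete. Activity names and types here are illustrative, not taken from the case study:

```json
{
  "name": "LoadAnalyticsPOC",
  "properties": {
    "activities": [
      { "name": "LoadRaw", "type": "Copy" },
      {
        "name": "LoadCleansed", "type": "Copy",
        "dependsOn": [
          { "activity": "LoadRaw", "dependencyConditions": [ "Succeeded" ] }
        ]
      },
      {
        "name": "LoadDimensionalModel", "type": "SqlServerStoredProcedure",
        "dependsOn": [
          { "activity": "LoadCleansed", "dependencyConditions": [ "Succeeded" ] }
        ]
      }
    ]
  }
}
```

The pipeline itself is then scheduled, and the dependency condition (`Succeeded`, `Failed`, `Skipped`, or `Completed`) controls when each downstream activity is allowed to run.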

You need to implement the date dimension in the data store. The solution must meet the technical requirements.

What are two ways to achieve the goal? Each correct answer presents a complete solution.

NOTE: Each correct selection is worth one point.

A. Populate the date dimension table by using a dataflow.
B. Populate the date dimension table by using a Stored procedure activity in a pipeline.
C. Populate the date dimension view by using T-SQL.
D. Populate the date dimension table by using a Copy activity in a pipeline.
Suggested answer: A, B

Explanation:

Both a dataflow (A) and a Stored procedure activity in a pipeline (B) are capable of creating and populating a date dimension table. A dataflow can perform the transformation needed to create the date dimension, and it aligns with the preference for using low-code tools for data ingestion when possible. A Stored procedure could be written to generate the necessary date dimension data and executed within a pipeline, which also adheres to the technical requirements for the PoC.
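A hypothetical sketch of option B: a stored procedure that fills a date dimension table one day at a time, which a pipeline's Stored procedure activity could then invoke. All names here are placeholders, and the integer date key is built with arithmetic rather than string formatting:

```sql
-- Hypothetical sketch; table, column, and procedure names are not from the case study.
CREATE PROCEDURE dbo.LoadDimDate
    @StartDate date,
    @EndDate   date
AS
BEGIN
    WHILE @StartDate <= @EndDate
    BEGIN
        INSERT INTO dbo.DimDate (DateKey, FullDate, CalendarYear, CalendarMonth, CalendarDay)
        VALUES (
            YEAR(@StartDate) * 10000 + MONTH(@StartDate) * 100 + DAY(@StartDate),
            @StartDate,
            YEAR(@StartDate), MONTH(@StartDate), DAY(@StartDate)
        );
        SET @StartDate = DATEADD(day, 1, @StartDate);
    END
END;
```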

HOTSPOT

You need to design a semantic model for the customer satisfaction report.

Which data source authentication method and mode should you use? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.


Question 10
Correct answer: (answer provided as an image)
Total 102 questions