
Microsoft DP-700 Practice Test - Questions Answers, Page 9


Question 81


HOTSPOT

You have a Fabric workspace that contains a lakehouse named Lakehouse1. Lakehouse1 contains a table named Status_Target that has the following columns:

* Key

* Status

* LastModified

The data source contains a table named Status_Source that has the same columns as Status_Target. Status_Source is used to populate Status_Target. In a notebook named Notebook1, you load Status_Source to a DataFrame named sourceDF and Status_Target to a DataFrame named targetDF. You need to implement an incremental loading pattern by using Notebook1. The solution must meet the following requirements:

* For all the matching records that have the same value of Key, update the value of LastModified in Status_Target to the value of LastModified in Status_Source.

* Insert all the records that exist in Status_Source that do NOT exist in Status_Target.

* Set the value of Status in Status_Target to inactive for all the records that were last modified more than seven days ago and that do NOT exist in Status_Source.

How should you complete the statement? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.
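
The requirements above amount to a Delta Lake upsert plus a deactivation pass. As a minimal sketch only, assuming Status_Target is a Delta table in Lakehouse1, that the runtime's Delta version supports whenNotMatchedBySourceUpdate, and that the target is addressed through a DeltaTable handle rather than targetDF, the merge in Notebook1 could look like this:

    from delta.tables import DeltaTable

    # Assumption: sourceDF was loaded from Status_Source earlier in Notebook1.
    sourceDF = spark.read.table("Status_Source")

    # MERGE needs a DeltaTable handle for the target rather than the targetDF DataFrame.
    target = DeltaTable.forName(spark, "Status_Target")

    (target.alias("t")
        .merge(sourceDF.alias("s"), "t.Key = s.Key")
        # Matching keys: refresh LastModified from the source.
        .whenMatchedUpdate(set={"LastModified": "s.LastModified"})
        # Keys that exist only in the source: insert them.
        .whenNotMatchedInsertAll()
        # Keys missing from the source and stale for more than seven days: mark inactive.
        .whenNotMatchedBySourceUpdate(
            condition="t.LastModified < date_sub(current_date(), 7)",
            set={"Status": "'inactive'"})
        .execute())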


Question 82


DRAG DROP

You are building a data loading pattern by using a Fabric data pipeline. The source is an Azure SQL database that contains 25 tables. The destination is a lakehouse.

In a warehouse, you create a control table named Control.Object as shown in the exhibit. (Click the Exhibit tab.)

You need to build a data pipeline that will support the dynamic ingestion of the tables listed in the control table by using a single execution.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
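
In a pipeline this pattern is typically built from a Lookup activity over Control.Object feeding a ForEach that runs a parameterized Copy activity. Purely for intuition, a rough notebook analogue of that loop is sketched below; the control-table column names, the connection string, and the assumption that the warehouse table is readable from Spark are all placeholders rather than facts from the question:

    # Hypothetical notebook analogue of the Lookup -> ForEach -> Copy pattern.
    control_rows = spark.read.table("Control.Object").collect()  # assumes the control table is reachable from Spark

    jdbc_url = "jdbc:sqlserver://<server>.database.windows.net;databaseName=<db>"  # placeholder connection

    for row in control_rows:
        source_table = row["SourceObjectName"]       # assumed column name
        target_table = row["DestinationObjectName"]  # assumed column name

        df = (spark.read
              .format("jdbc")
              .option("url", jdbc_url)          # authentication options omitted
              .option("dbtable", source_table)
              .load())

        df.write.mode("overwrite").saveAsTable(target_table)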



Question 83


You are implementing a medallion architecture in a Fabric lakehouse.

You plan to create a dimension table that will contain the following columns:

* ID

* CustomerCode

* CustomerName

* CustomerAddress

* CustomerLocation

* ValidFrom

* ValidTo

You need to ensure that the table supports the analysis of historical sales data by customer location at the time of each sale. Which type of slowly changing dimension (SCD) should you use?
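
The ValidFrom and ValidTo columns are the usual way to keep every historical version of a customer row (often described as a Type 2 pattern), so that a sale can be joined to the attributes that were valid on the sale date. A minimal sketch of that idea, with every value invented for illustration:

    from datetime import date
    from pyspark.sql import Row

    # Two versions of the same customer: the old row is closed by ValidTo,
    # the new row carries the current location. All values are invented.
    dim_customer = spark.createDataFrame([
        Row(ID=1, CustomerCode="C001", CustomerName="Contoso",
            CustomerAddress="1 Old St", CustomerLocation="Seattle",
            ValidFrom=date(2023, 1, 1), ValidTo=date(2024, 6, 30)),
        Row(ID=2, CustomerCode="C001", CustomerName="Contoso",
            CustomerAddress="2 New Ave", CustomerLocation="Portland",
            ValidFrom=date(2024, 7, 1), ValidTo=None),
    ])

    sales = spark.createDataFrame(
        [Row(CustomerCode="C001", SaleDate=date(2024, 3, 15), Amount=100.0)])

    # Join each sale to the dimension row whose validity window contains the
    # sale date, so the analysis reflects the location at the time of the sale.
    historical = sales.join(
        dim_customer,
        (sales.CustomerCode == dim_customer.CustomerCode)
        & (sales.SaleDate >= dim_customer.ValidFrom)
        & (dim_customer.ValidTo.isNull() | (sales.SaleDate <= dim_customer.ValidTo)),
    )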


Question 84


You have a Fabric workspace that contains an eventstream named EventStream1. EventStream1 outputs events to a table named Table1 in a lakehouse. The streaming data is sourced from motorway sensors and represents the speed of cars.

You need to add a transformation to EventStream1 to average the car speeds. The speeds must be grouped by non-overlapping and contiguous time intervals of one minute. Each event must belong to exactly one window.

Which windowing function should you use?
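
Grouping into non-overlapping, contiguous fixed-length intervals where each event lands in exactly one window is what Spark calls a tumbling window. The eventstream transformation is configured in the editor rather than in code, but a rough structured-streaming analogue is sketched below; the EventTime and Speed column names are assumptions:

    from pyspark.sql import functions as F

    # Assumed columns: EventTime (event timestamp) and Speed (numeric).
    speeds = spark.readStream.table("Table1")

    avg_speed = (speeds
                 .withColumn("EventTime", F.col("EventTime").cast("timestamp"))
                 .withWatermark("EventTime", "1 minute")
                 # A single window duration with no slide interval produces
                 # non-overlapping, contiguous one-minute windows.
                 .groupBy(F.window(F.col("EventTime"), "1 minute"))
                 .agg(F.avg("Speed").alias("AverageSpeed")))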


Question 85


HOTSPOT

You have a table in a Fabric lakehouse that contains the following data.

[Exhibit: table data not shown]

You have a notebook that contains the following code segment.

[Exhibit: notebook code segment not shown]

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

NOTE: Each correct selection is worth one point.



Question 86


You have a Fabric workspace named Workspace1. Your company acquires GitHub licenses.

You need to configure source control for Workspace1 to use GitHub. The solution must follow the principle of least privilege. Which permissions do you require to ensure that you can commit code to GitHub?


Question 87


You have an Azure key vault named KeyVault1 that contains secrets.

You have a Fabric workspace named Workspace1. Workspace1 contains a notebook named Notebook1 that performs the following tasks:

* Loads stage data to the target tables in a lakehouse

* Triggers the refresh of a semantic model

You plan to add functionality to Notebook1 that will use the Fabric API to monitor the semantic model refreshes. You need to retrieve the registered application ID and secret from KeyVault1 to generate the authentication token.

Solution: You use the following code segment:

Use notebookutils.credentials.getSecret and specify the key vault URL and the key vault secret.

Does this meet the goal?
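
As a hedged illustration of the call shape in a Fabric notebook (the vault URL and secret names below are placeholders, not values from the question):

    # notebookutils is available by default in Fabric notebooks; no import is required there.
    key_vault_url = "https://keyvault1.vault.azure.net/"   # placeholder endpoint

    # Each call takes the key vault URL and the name of a secret stored in that vault.
    app_id = notebookutils.credentials.getSecret(key_vault_url, "app-id")          # placeholder secret name
    app_secret = notebookutils.credentials.getSecret(key_vault_url, "app-secret")  # placeholder secret name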


Question 88


You have an Azure key vault named KeyVault1 that contains secrets.

You have a Fabric workspace named Workspace1. Workspace1 contains a notebook named Notebook1 that performs the following tasks:

* Loads stage data to the target tables in a lakehouse

* Triggers the refresh of a semantic model

You plan to add functionality to Notebook1 that will use the Fabric API to monitor the semantic model refreshes. You need to retrieve the registered application ID and secret from KeyVault1 to generate the authentication token.

Solution: You use the following code segment:

Use notebookutils.credentials.getSecret and specify the key vault URL and the name of a linked service.

Does this meet the goal?


Question 89


You need to develop an orchestration solution in Fabric that will load each item one after the other. The solution must be scheduled to run every 15 minutes. Which type of item should you use?


Question 90


You have a Fabric workspace that contains a lakehouse named Lakehouse1.

You plan to create a data pipeline named Pipeline1 to ingest data into Lakehouse1. You will use a parameter named param1 to pass an external value into Pipeline1. The param1 parameter has a data type of int.

You need to ensure that the pipeline expression returns param1 as an int value.

How should you specify the parameter value?
