Question 236 - DP-203 discussion


You have an Azure Data Factory pipeline named Pipeline1. Pipeline1 contains a copy activity that sends data to an Azure Data Lake Storage Gen2 account, and Pipeline1 is executed by a schedule trigger. You change the copy activity sink to a new storage account and merge the changes into the collaboration branch. After Pipeline1 executes, you discover that data is NOT copied to the new storage account. You need to ensure that the data is copied to the new storage account. What should you do?

A. Publish from the collaboration branch.
B. Configure the change feed of the new storage account.
C. Create a pull request.
D. Modify the schedule trigger.
Suggested answer: A

Explanation:

CI/CD lifecycle

A development data factory is created and configured with Azure Repos Git. All developers should have permission to author Data Factory resources such as pipelines and datasets. A developer creates a feature branch to make a change and debugs pipeline runs with the most recent changes. Once satisfied, the developer creates a pull request from the feature branch to the main (collaboration) branch so that peers can review the changes.
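As an illustration of the pull-request step, here is a minimal sketch that opens a PR from a feature branch into the collaboration branch using the azure-devops Python package. The organization, project, repository, branch names, and the personal access token are all placeholders, and the versioned models module path can differ between package releases; this is a sketch of the workflow, not Microsoft's reference code.

```python
# Hypothetical sketch: open a pull request from a feature branch into the
# collaboration (main) branch with the azure-devops SDK.
# Org/project/repo names, branch names, and the PAT are placeholders.
from azure.devops.connection import Connection
from azure.devops.v7_1.git.models import GitPullRequest  # module version may vary
from msrest.authentication import BasicAuthentication

credentials = BasicAuthentication("", "<personal-access-token>")
connection = Connection(
    base_url="https://dev.azure.com/<organization>", creds=credentials
)
git_client = connection.clients.get_git_client()

pull_request = GitPullRequest(
    source_ref_name="refs/heads/feature/new-sink",  # developer's feature branch
    target_ref_name="refs/heads/main",              # collaboration branch
    title="Point copy activity sink at the new storage account",
    description="Changes the Pipeline1 copy activity sink dataset.",
)

created = git_client.create_pull_request(
    git_pull_request_to_create=pull_request,
    repository_id="<repository>",
    project="<project>",
)
print(f"Created PR {created.pull_request_id}")
```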

After a pull request is approved and the changes are merged into the main branch, those changes still exist only in Git. A schedule trigger executes the last published version of a pipeline, so the new sink takes effect only after you publish from the collaboration branch, which updates the live Data Factory service.
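Publishing from the collaboration branch is normally done with the Publish button in ADF Studio, which also generates ARM templates (such as ARMTemplateForFactory.json) in the publish branch. In an automated CI/CD setup, that generated template can be deployed to the factory's resource group; the sketch below shows one way to do this with the azure-mgmt-resource package. The subscription ID, resource group, factory name, and file path are placeholder assumptions.

```python
# Hypothetical sketch: deploy the ARM template that ADF's publish step
# generates (ARMTemplateForFactory.json) to the factory's resource group.
# Subscription, resource group, factory name, and paths are placeholders.
import json

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.resource.resources.models import (
    Deployment,
    DeploymentMode,
    DeploymentProperties,
)

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")

with open("ARMTemplateForFactory.json") as f:
    template = json.load(f)

deployment = client.deployments.begin_create_or_update(
    resource_group_name="<resource-group>",
    deployment_name="publish-pipeline1",
    parameters=Deployment(
        properties=DeploymentProperties(
            mode=DeploymentMode.INCREMENTAL,  # only update changed resources
            template=template,
            parameters={"factoryName": {"value": "<factory-name>"}},
        )
    ),
).result()
print(f"Deployment state: {deployment.properties.provisioning_state}")
```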

Reference: https://docs.microsoft.com/en-us/azure/data-factory/continuous-integration-delivery
