ExamGecko

Microsoft DP-300 Practice Test - Questions Answers, Page 9


You need to recommend a solution that will enable remote developers to access DB1 and DB2. The solution must support the planned changes and meet the security requirements.

What should you include in the recommendation?

A. a public endpoint via a database-level firewall rule
B. a private endpoint
C. a public endpoint via a server-level firewall rule
D. a Point-to-Site (P2S) VPN
Suggested answer: B

You need to recommend a solution to ensure that the performance of DB3 is optimized after the migration to Azure SQL Database. The solution must meet availability requirements.

What should you include in the recommendation?

A. Resource Governor
B. a custom resource pool
C. vertical scaling
D. horizontal scaling
Suggested answer: C

You need to recommend a solution to meet the security requirements and the business requirements for DB3. What should you recommend as the first step of the solution?

A. Run the sys.sp_cdc_enable_db stored procedure.
B. Run the ALTER TABLE statement and specify the ENABLE CHANGE_TRACKING clause.
C. Run the ALTER DATABASE statement and specify the SET CHANGE_TRACKING = ON clause.
D. Run the sp_addarticle stored procedure.
Suggested answer: C
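As a sketch of the suggested first step, the database-level statement could look as follows (the retention settings and the table name dbo.Sales in the follow-up step are illustrative assumptions; DB3 comes from the question):

```sql
-- First step: enable change tracking at the database level for DB3.
ALTER DATABASE DB3
SET CHANGE_TRACKING = ON
(CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

-- A later step would then enable change tracking per table, for example:
ALTER TABLE dbo.Sales
ENABLE CHANGE_TRACKING
WITH (TRACK_COLUMNS_UPDATED = ON);
```

Change tracking must be enabled on the database before any ALTER TABLE ... ENABLE CHANGE_TRACKING statement will succeed, which is why the ALTER DATABASE statement is the first step.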

You have an Azure SQL Database managed instance named SQLMI1. A Microsoft SQL Server Agent job runs on SQLMI1. You need to ensure that an automatic email notification is sent once the job completes.

What should you include in the solution?

A. From SQL Server Configuration Manager (SSCM), enable SQL Server Agent
B. From SQL Server Management Studio (SSMS), run sp_set_sqlagent_properties
C. From SQL Server Management Studio (SSMS), create a Database Mail profile
D. From the Azure portal, create an Azure Monitor action group that has an Email/SMS/Push/Voice action
Suggested answer: C

Explanation:

To send a notification in response to an alert, you must first configure SQL Server Agent to send mail.

To configure SQL Server Agent to use Database Mail in SQL Server Management Studio:

1. In Object Explorer, expand a SQL Server instance.

2. Right-click SQL Server Agent, and then click Properties.

3. Click Alert System.

4. Select Enable Mail Profile.

5. In the Mail system list, select Database Mail.

6. In the Mail profile list, select a mail profile for Database Mail.

7. Restart SQL Server Agent.

Note: Prerequisites include:

Enable Database Mail.

Create a Database Mail account for the SQL Server Agent service account to use.

Create a Database Mail profile for the SQL Server Agent service account to use and add the user to the DatabaseMailUserRole in the msdb database. Set the profile as the default profile for the msdb database.
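The prerequisites above can also be scripted with the msdb Database Mail stored procedures. A minimal sketch, in which the account, profile, SMTP server, and e-mail address names are purely illustrative assumptions:

```sql
-- Sketch only: account, profile, server, and address names are illustrative.
USE msdb;
GO

-- Enable the Database Mail extended stored procedures on the instance.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Database Mail XPs', 1;
RECONFIGURE;
GO

-- Create a Database Mail account and profile, then link them.
EXEC dbo.sysmail_add_account_sp
    @account_name = N'AgentMailAccount',
    @email_address = N'sqlagent@contoso.com',
    @mailserver_name = N'smtp.contoso.com';

EXEC dbo.sysmail_add_profile_sp
    @profile_name = N'AgentMailProfile';

EXEC dbo.sysmail_add_profileaccount_sp
    @profile_name = N'AgentMailProfile',
    @account_name = N'AgentMailAccount',
    @sequence_number = 1;
GO
```

After the profile exists, it can be selected in the SQL Server Agent Alert System page as described in the numbered steps above.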

Reference:

https://docs.microsoft.com/en-us/sql/relational-databases/database-mail/configure-sql-server-agent-mail-to-use-database-mail

DRAG DROP

You have SQL Server on an Azure virtual machine named SQL1.

SQL1 has an agent job to back up all databases.

You add a user named dbadmin1 as a SQL Server Agent operator.

You need to ensure that dbadmin1 receives an email alert if a job fails.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.


Question 85 (drag and drop). The answer image is not reproduced; the correct sequence of actions is given in the explanation below.

Explanation:

Step 1: Enable the email settings for the SQL Server Agent.

To send a notification in response to an alert, you must first configure SQL Server Agent to send mail.

Step 2: Create a job alert

Step 3: Create a job notification

Example:

-- Adds an e-mail notification for the specified alert (Test Alert).
-- This example assumes that Test Alert already exists
-- and that François Ajenstat is a valid operator name.
USE msdb;
GO

EXEC dbo.sp_add_notification
    @alert_name = N'Test Alert',
    @operator_name = N'François Ajenstat',
    @notification_method = 1;
GO
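For this scenario specifically, the operator registration and the per-job failure notification (steps 2 and 3) can be sketched as follows. The operator name dbadmin1 comes from the question; the job name and e-mail address are illustrative assumptions:

```sql
-- Sketch only: assumes Database Mail is already configured (step 1).
USE msdb;
GO

-- Register dbadmin1 as an operator (the e-mail address is illustrative).
EXEC dbo.sp_add_operator
    @name = N'dbadmin1',
    @email_address = N'dbadmin1@contoso.com';
GO

-- Notify dbadmin1 by e-mail when the backup job fails
-- ('Back up all databases' is a hypothetical job name).
EXEC dbo.sp_update_job
    @job_name = N'Back up all databases',
    @notify_level_email = 2,  -- 2 = notify on failure
    @notify_email_operator_name = N'dbadmin1';
GO
```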

Reference:

https://docs.microsoft.com/en-us/sql/ssms/agent/notify-an-operator-of-job-status

https://docs.microsoft.com/en-us/sql/ssms/agent/assign-alerts-to-an-operator

DRAG DROP

You need to apply 20 built-in Azure Policy definitions to all new and existing Azure SQL Database deployments in an Azure subscription. The solution must minimize administrative effort.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.


Question 86 (drag and drop). The answer image is not reproduced; the correct sequence of actions is given in the explanation below.

Explanation:

Step 1: Create an Azure Policy Initiative

The first step in enforcing compliance with Azure Policy is to assign a policy definition. A policy definition defines under what condition a policy is enforced and what effect to take.

With an initiative definition, you can group several policy definitions to achieve one overarching goal. An initiative evaluates resources within scope of the assignment for compliance to the included policies.

Step 2: Create an Azure Policy Initiative assignment

Assign the initiative definition you created in the previous step.

Step 3: Run Azure Policy remediation tasks

To apply the Policy Initiative to the existing SQL databases.

Reference:

https://docs.microsoft.com/en-us/azure/governance/policy/tutorials/create-and-manage

You need to trigger an Azure Data Factory pipeline when a file arrives in an Azure Data Lake Storage Gen2 container. Which resource provider should you enable?

A. Microsoft.EventHub
B. Microsoft.EventGrid
C. Microsoft.Sql
D. Microsoft.Automation
Suggested answer: B

Explanation:

Event-driven architecture (EDA) is a common data integration pattern that involves production, detection, consumption, and reaction to events. Data integration scenarios often require Data Factory customers to trigger pipelines based on events happening in storage account, such as the arrival or deletion of a file in Azure Blob Storage account. Data Factory natively integrates with Azure Event Grid, which lets you trigger pipelines on such events.

Reference: https://docs.microsoft.com/en-us/azure/data-factory/how-to-create-event-trigger

You have the following Azure Data Factory pipelines:

Ingest Data from System1

Ingest Data from System2

Populate Dimensions

Populate Facts

Ingest Data from System1 and Ingest Data from System2 have no dependencies. Populate Dimensions must execute after Ingest Data from System1 and Ingest Data from System2. Populate Facts must execute after the Populate Dimensions pipeline. All the pipelines must execute every eight hours.

What should you do to schedule the pipelines for execution?

A. Add a schedule trigger to all four pipelines.
B. Add an event trigger to all four pipelines.
C. Create a parent pipeline that contains the four pipelines and use an event trigger.
D. Create a parent pipeline that contains the four pipelines and use a schedule trigger.
Suggested answer: D

Explanation:

Reference: https://www.mssqltips.com/sqlservertip/6137/azure-data-factory-control-flow-activities-overview/

You have an Azure Data Factory pipeline that performs an incremental load of source data to an Azure Data Lake Storage Gen2 account.

Data to be loaded is identified by a column named LastUpdatedDate in the source table.

You plan to execute the pipeline every four hours.

You need to ensure that the pipeline execution meets the following requirements:

Automatically retries the execution when the pipeline run fails due to concurrency or throttling limits.

Supports backfilling existing data in the table.

Which type of trigger should you use?

A. tumbling window
B. on-demand
C. event
D. schedule
Suggested answer: A

Explanation:

The tumbling window trigger supports backfill scenarios: pipeline runs can be scheduled for windows in the past. It also provides a retry policy for pipeline runs that fail because of concurrency or throttling limits.

Incorrect answers:

D: Schedule trigger does not support backfill scenarios. Pipeline runs can be executed only on time periods from the current time and the future.

Reference: https://docs.microsoft.com/en-us/azure/data-factory/concepts-pipeline-execution-triggers

You have an Azure Data Factory that contains 10 pipelines.

You need to label each pipeline with its main purpose of either ingest, transform, or load. The labels must be available for grouping and filtering when using the monitoring experience in Data Factory. What should you add to each pipeline?

A. an annotation
B. a resource tag
C. a run group ID
D. a user property
E. a correlation ID
Suggested answer: A

Explanation:

Azure Data Factory annotations help you easily filter different Azure Data Factory objects based on a tag. You can define tags so you can see their performance or find errors faster.

Reference: https://www.techtalkcorner.com/monitor-azure-data-factory-annotations/

Total 338 questions