Microsoft DP-300 Practice Test - Questions Answers, Page 9

Question 81

You need to recommend a solution that will enable remote developers to access DB1 and DB2. The solution must support the planned changes and meet the security requirements.

What should you include in the recommendation?

A. a public endpoint via a database-level firewall rule
B. a private endpoint
C. a public endpoint via a server-level firewall rule
D. a Point-to-Site (P2S) VPN
Suggested answer: B

Question 82

You need to recommend a solution to ensure that the performance of DB3 is optimized after the migration to Azure SQL Database. The solution must meet availability requirements.

What should you include in the recommendation?

A. Resource Governor
B. a custom resource pool
C. vertical scaling
D. horizontal scaling
Suggested answer: C

Question 83

You need to recommend a solution to meet the security requirements and the business requirements for DB3. What should you recommend as the first step of the solution?

A. Run the sys.sp_cdc_enable_db stored procedure.
B. Run the ALTER TABLE statement and specify the ENABLE CHANGE_TRACKING clause.
C. Run the ALTER DATABASE statement and specify the SET CHANGE_TRACKING = ON clause.
D. Run the sp_addarticle stored procedure.
Suggested answer: C
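
As a sketch of answer C: change tracking is enabled at the database level first, and only then per table. A minimal T-SQL example; the table name dbo.Sales is an illustrative assumption, not taken from the case study:

-- Database-level step (the first step the question asks for).
ALTER DATABASE DB3
SET CHANGE_TRACKING = ON
(CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);
GO
-- Per-table step, which can only run after the database-level step.
ALTER TABLE dbo.Sales
ENABLE CHANGE_TRACKING
WITH (TRACK_COLUMNS_UPDATED = ON);
GO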

Question 84

You have an Azure SQL Database managed instance named SQLMI1. A Microsoft SQL Server Agent job runs on SQLMI1. You need to ensure that an automatic email notification is sent once the job completes.

What should you include in the solution?

A. From SQL Server Configuration Manager, enable SQL Server Agent
B. From SQL Server Management Studio (SSMS), run sp_set_sqlagent_properties
C. From SQL Server Management Studio (SSMS), create a Database Mail profile
D. From the Azure portal, create an Azure Monitor action group that has an Email/SMS/Push/Voice action
Suggested answer: C

Explanation:

To send a notification in response to an alert, you must first configure SQL Server Agent to send mail.

Using SQL Server Management Studio, configure SQL Server Agent to use Database Mail as follows:

1. In Object Explorer, expand a SQL Server instance.

2. Right-click SQL Server Agent, and then click Properties.

3. Click Alert System.

4. Select Enable Mail Profile.

5. In the Mail system list, select Database Mail.

6. In the Mail profile list, select a mail profile for Database Mail.

7. Restart SQL Server Agent.

Note: Prerequisites include:

Enable Database Mail.

Create a Database Mail account for the SQL Server Agent service account to use.

Create a Database Mail profile for the SQL Server Agent service account to use and add the user to the DatabaseMailUserRole in the msdb database. Set the profile as the default profile for the msdb database.
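
A minimal T-SQL sketch of these prerequisites; the account name, e-mail address, and SMTP server below are illustrative assumptions, and the profile name reflects the one SQL Server Agent uses on a managed instance:

USE msdb;
GO
-- Create a Database Mail account; the address and SMTP server are illustrative.
EXEC dbo.sysmail_add_account_sp
    @account_name = N'AgentMailAccount',
    @email_address = N'sqlagent@contoso.com',
    @mailserver_name = N'smtp.contoso.com';
-- Create the profile and bind the account to it. On SQL Managed Instance,
-- SQL Server Agent sends mail through the profile named
-- AzureManagedInstance_dbmail_profile.
EXEC dbo.sysmail_add_profile_sp
    @profile_name = N'AzureManagedInstance_dbmail_profile';
EXEC dbo.sysmail_add_profileaccount_sp
    @profile_name = N'AzureManagedInstance_dbmail_profile',
    @account_name = N'AgentMailAccount',
    @sequence_number = 1;
GO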

Reference:

https://docs.microsoft.com/en-us/sql/relational-databases/database-mail/configure-sql-server-agent-mail-to-use-database-mail


Question 85

DRAG DROP

You have SQL Server on an Azure virtual machine named SQL1.

SQL1 has an agent job to back up all databases.

You add a user named dbadmin1 as a SQL Server Agent operator.

You need to ensure that dbadmin1 receives an email alert if a job fails.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.


[Image: drag-and-drop question and correct answer sequence; the ordered steps are restated in the explanation below.]

Explanation:

Step 1: Enable the email settings for the SQL Server Agent.

To send a notification in response to an alert, you must first configure SQL Server Agent to send mail.

Step 2: Create a job alert

Step 3: Create a job notification

Example:

-- Adds an e-mail notification for the specified alert (Test Alert).
-- This example assumes that Test Alert already exists
-- and that François Ajenstat is a valid operator name.
USE msdb;
GO
EXEC dbo.sp_add_notification
    @alert_name = N'Test Alert',
    @operator_name = N'François Ajenstat',
    @notification_method = 1;
GO
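
Because this question notifies an operator about a job rather than an alert, the notification step itself can also be scripted. A hedged sketch; the job name Back Up All Databases is an assumption, since the actual job name is not given:

EXEC msdb.dbo.sp_update_job
    @job_name = N'Back Up All Databases',      -- assumed job name
    @notify_level_email = 2,                   -- 2 = notify on job failure
    @notify_email_operator_name = N'dbadmin1'; -- the operator added earlier
GO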

Reference:

https://docs.microsoft.com/en-us/sql/ssms/agent/notify-an-operator-of-job-status

https://docs.microsoft.com/en-us/sql/ssms/agent/assign-alerts-to-an-operator


Question 86

DRAG DROP

You need to apply 20 built-in Azure Policy definitions to all new and existing Azure SQL Database deployments in an Azure subscription. The solution must minimize administrative effort.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.


[Image: drag-and-drop question and correct answer sequence; the ordered steps are restated in the explanation below.]

Explanation:

Step 1: Create an Azure Policy Initiative

The first step in enforcing compliance with Azure Policy is to assign a policy definition. A policy definition defines under what condition a policy is enforced and what effect to take.

With an initiative definition, you can group several policy definitions to achieve one overarching goal. An initiative evaluates resources within the scope of the assignment for compliance with the included policies.

Step 2: Create an Azure Policy Initiative assignment

Assign the initiative definition you created in the previous step.

Step 3: Run Azure Policy remediation tasks

Remediation tasks apply the initiative to the existing SQL databases.

Reference:

https://docs.microsoft.com/en-us/azure/governance/policy/tutorials/create-and-manage


Question 87

You need to trigger an Azure Data Factory pipeline when a file arrives in an Azure Data Lake Storage Gen2 container. Which resource provider should you enable?

A. Microsoft.EventHub
B. Microsoft.EventGrid
C. Microsoft.Sql
D. Microsoft.Automation
Suggested answer: B

Explanation:

Event-driven architecture (EDA) is a common data integration pattern that involves the production, detection, consumption of, and reaction to events. Data integration scenarios often require Data Factory customers to trigger pipelines based on events happening in a storage account, such as the arrival or deletion of a file in an Azure Blob Storage account. Data Factory natively integrates with Azure Event Grid, which lets you trigger pipelines on such events.

Reference: https://docs.microsoft.com/en-us/azure/data-factory/how-to-create-event-trigger


Question 88

You have the following Azure Data Factory pipelines:

Ingest Data from System1

Ingest Data from System2

Populate Dimensions

Populate Facts

Ingest Data from System1 and Ingest Data from System2 have no dependencies. Populate Dimensions must execute after Ingest Data from System1 and Ingest Data from System2. Populate Facts must execute after the Populate Dimensions pipeline. All the pipelines must execute every eight hours.

What should you do to schedule the pipelines for execution?

A. Add a schedule trigger to all four pipelines.
B. Add an event trigger to all four pipelines.
C. Create a parent pipeline that contains the four pipelines and use an event trigger.
D. Create a parent pipeline that contains the four pipelines and use a schedule trigger.
Suggested answer: D

Explanation:

A parent pipeline can run the four pipelines through Execute Pipeline activities, enforcing the required dependency order; a single schedule trigger on the parent then executes the whole sequence every eight hours.

Reference: https://www.mssqltips.com/sqlservertip/6137/azure-data-factory-control-flow-activities-overview/


Question 89

You have an Azure Data Factory pipeline that performs an incremental load of source data to an Azure Data Lake Storage Gen2 account.

Data to be loaded is identified by a column named LastUpdatedDate in the source table.

You plan to execute the pipeline every four hours.

You need to ensure that the pipeline execution meets the following requirements:

Automatically retries the execution when the pipeline run fails due to concurrency or throttling limits.

Supports backfilling existing data in the table.

Which type of trigger should you use?

A. tumbling window
B. on-demand
C. event
D. schedule
Suggested answer: A

Explanation:

The tumbling window trigger supports backfill scenarios: pipeline runs can be scheduled for windows in the past.

Incorrect answers:

D: A schedule trigger does not support backfill scenarios. Pipeline runs can be executed only on time periods from the current time and the future.

Reference: https://docs.microsoft.com/en-us/azure/data-factory/concepts-pipeline-execution-triggers


Question 90

You have an Azure Data Factory that contains 10 pipelines.

You need to label each pipeline with its main purpose of either ingest, transform, or load. The labels must be available for grouping and filtering when using the monitoring experience in Data Factory.

What should you add to each pipeline?

A. an annotation
B. a resource tag
C. a run group ID
D. a user property
E. a correlation ID
Suggested answer: A

Explanation:

Azure Data Factory annotations help you easily filter and group Azure Data Factory objects based on a tag. You can define annotations to monitor pipeline performance or find errors faster.

Reference: https://www.techtalkcorner.com/monitor-azure-data-factory-annotations/
