Microsoft DP-300 Practice Test - Questions Answers, Page 13
HOTSPOT

You plan to migrate on-premises Microsoft SQL Server databases to Azure.

You need to identify which deployment and resiliency options meet the following requirements:

Support user-initiated backups.

Support multiple automatically replicated instances across Azure regions.

Minimize administrative effort to implement and maintain business continuity.

What should you identify? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.


Question 121

Correct answer: Box 1: SQL Server on Azure VMs. Box 2: Active geo-replication.

Explanation:

Box 1: SQL Server on Azure VMs

SQL Server on Azure Virtual Machines can take advantage of Automated Backup, which regularly backs up your databases to blob storage. You can also take backups manually, which satisfies the user-initiated backup requirement.

Box 2: Active geo-replication

Geo-replication for services such as Azure SQL Database and Cosmos DB will create secondary replicas of your data across multiple regions. While both services will automatically replicate data within the same region, geo-replication protects you against a regional outage by enabling you to fail over to a secondary region.
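As a hedged illustration of Box 2, active geo-replication for Azure SQL Database can be configured in T-SQL from the primary logical server (the database and server names below are hypothetical):

```sql
-- Run in the master database of the primary logical server.
-- Creates a continuously replicated, readable secondary of DB1
-- on a partner server in another region.
ALTER DATABASE DB1
    ADD SECONDARY ON SERVER secondary_server_name
    WITH ( ALLOW_CONNECTIONS = ALL );
```

Failover to the secondary region is then a planned or forced failover initiated on the secondary, which is what protects against a regional outage.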

Reference:

https://docs.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/sql-server-on-azure-vm-iaas-what-is-overview

https://docs.microsoft.com/en-us/dotnet/architecture/cloud-native/infrastructure-resiliency-azure

Question 122

You need to design a data retention solution for the Twitter feed data records. The solution must meet the customer sentiment analytics requirements.

Which Azure Storage functionality should you include in the solution?

A. time-based retention

B. change feed

C. lifecycle management

D. soft delete

Suggested answer: C

Explanation:

The lifecycle management policy lets you:

Delete blobs, blob versions, and blob snapshots at the end of their lifecycles

Scenario:

Purge Twitter feed data records that are older than two years.

Store Twitter feeds in Azure Storage by using Event Hubs Capture. The feeds will be converted into Parquet files. Minimize administrative effort to maintain the Twitter feed data records.
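A minimal sketch of such a lifecycle management policy, with the two-year requirement expressed as 730 days (the rule name and the `twitter-feed/` container prefix are hypothetical):

```json
{
  "rules": [
    {
      "name": "purge-twitter-feeds",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "twitter-feed/" ]
        },
        "actions": {
          "baseBlob": {
            "delete": { "daysAfterModificationGreaterThan": 730 }
          }
        }
      }
    }
  ]
}
```

Once attached to the storage account, the policy runs automatically, which is what minimizes the administrative effort of maintaining the records.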

Incorrect Answers:

A: Time-based retention policy support: Users can set policies to store data for a specified interval. When a time-based retention policy is set, blobs can be created and read, but not modified or deleted. After the retention period has expired, blobs can be deleted but not overwritten.

Reference:

https://docs.microsoft.com/en-us/azure/storage/blobs/storage-lifecycle-management-concepts

Question 123

You need to implement the surrogate key for the retail store table. The solution must meet the sales transaction dataset requirements.

What should you create?

A. a table that has a FOREIGN KEY constraint

B. a table that has an IDENTITY property

C. a user-defined SEQUENCE object

D. a system-versioned temporal table

Suggested answer: B

Explanation:

Scenario: Contoso requirements for the sales transaction dataset include:

Implement a surrogate key to account for changes to the retail store addresses.

A surrogate key on a table is a column with a unique identifier for each row. The key is not generated from the table data. Data modelers like to create surrogate keys on their tables when they design data warehouse models. You can use the IDENTITY property to achieve this goal simply and effectively without affecting load performance.
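In an Azure Synapse dedicated SQL pool, such a table could be sketched as follows (the table and column names are hypothetical):

```sql
-- Hypothetical retail store dimension table. The surrogate key is
-- generated by the IDENTITY property rather than derived from the
-- data, so it is stable even when store addresses change.
CREATE TABLE dbo.DimRetailStore
(
    StoreKey     INT IDENTITY(1, 1) NOT NULL,  -- surrogate key
    StoreId      INT                NOT NULL,  -- business key
    StoreAddress NVARCHAR(200)      NOT NULL
)
WITH
(
    DISTRIBUTION = ROUND_ROBIN,
    CLUSTERED COLUMNSTORE INDEX
);
```

Note that in a dedicated SQL pool the IDENTITY column cannot be the distribution column, which is one reason a separate business key is kept alongside it.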

Reference:

https://docs.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/sql-data-warehouse-tables-identity

HOTSPOT

You need to design an analytical storage solution for the transactional data. The solution must meet the sales transaction dataset requirements.

What should you include in the solution? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.


Question 124

Correct answer: Box 1: Hash. Box 2: Round-robin.

Explanation:

Box 1: Hash

Scenario:

Ensure that queries joining and filtering sales transaction records based on product ID complete as quickly as possible.

A hash distributed table can deliver the highest query performance for joins and aggregations on large tables.

Box 2: Round-robin

Scenario:

You plan to create a promotional table that will contain a promotion ID. The promotion ID will be associated to a specific product. The product will be identified by a product ID. The table will be approximately 5 GB.

A round-robin table is the most straightforward table to create and delivers fast performance when used as a staging table for loads. Choose round-robin distribution when:

You cannot identify a single key to distribute your data.

Your data doesn't frequently join with data from other tables.

There are no obvious joining keys.

Incorrect Answers:

Replicated: Replicated tables eliminate the need to transfer data across compute nodes by replicating a full copy of the data of the specified table to each compute node. The best candidates for replicated tables are tables with sizes less than 2 GB compressed and small dimension tables.
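The two chosen distributions can be sketched in dedicated SQL pool DDL (the table and column names are hypothetical):

```sql
-- Large sales transaction table: hash-distribute on the product ID
-- so joins and filters on that key avoid data movement.
CREATE TABLE dbo.FactSalesTransaction
(
    TransactionId BIGINT        NOT NULL,
    ProductId     INT           NOT NULL,
    Amount        DECIMAL(18,2) NOT NULL
)
WITH ( DISTRIBUTION = HASH(ProductId), CLUSTERED COLUMNSTORE INDEX );

-- ~5 GB promotional table: too large for a replicated table (< 2 GB
-- guideline) and with no clear distribution key, so use round-robin.
CREATE TABLE dbo.Promotion
(
    PromotionId INT NOT NULL,
    ProductId   INT NOT NULL
)
WITH ( DISTRIBUTION = ROUND_ROBIN, CLUSTERED COLUMNSTORE INDEX );
```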

Reference:

https://rajanieshkaushikk.com/2020/09/09/how-to-choose-right-data-distribution-strategy-for-azure-synapse/

Question 125

You have SQL Server on an Azure virtual machine that contains a database named DB1. DB1 contains a table named CustomerPII. You need to record whenever users query the CustomerPII table.

Which two options should you enable? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

A. server audit specification

B. SQL Server audit

C. database audit specification

D. a server principal
Suggested answer: B, C

Explanation:

To capture table-level activity such as SELECT statements in SQL Server on an Azure virtual machine, you first create a SQL Server audit, which defines where audit records are written (a file, or the Windows Security or Application log). You then create a database audit specification in DB1 that adds the SELECT action on the CustomerPII table to that audit. A server audit specification, by contrast, covers only server-level action groups and cannot target an individual table.

Note:

The Server Audit Specification object belongs to an audit.

A Database Audit Specification defines which Audit Action Groups will be audited for the specific database in which the specification is created.
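A minimal sketch in T-SQL (the audit name, specification name, and file path are hypothetical):

```sql
USE master;
GO
-- The audit object defines the target for audit records.
CREATE SERVER AUDIT CustomerPII_Audit
    TO FILE ( FILEPATH = 'D:\Audits\' );  -- hypothetical path
GO
ALTER SERVER AUDIT CustomerPII_Audit WITH ( STATE = ON );
GO

USE DB1;
GO
-- The database audit specification records every SELECT
-- issued against the CustomerPII table by any user.
CREATE DATABASE AUDIT SPECIFICATION CustomerPII_Select
    FOR SERVER AUDIT CustomerPII_Audit
    ADD ( SELECT ON OBJECT::dbo.CustomerPII BY public )
    WITH ( STATE = ON );
GO
```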

Reference:

https://docs.microsoft.com/en-us/azure/azure-sql/database/auditing-overview

Question 126

You have an Azure virtual machine based on a custom image named VM1.

VM1 hosts an instance of Microsoft SQL Server 2019 Standard.

You need to automate the maintenance of VM1 to meet the following requirements:

Automate the patching of SQL Server and Windows Server.

Automate full database backups and transaction log backups of the databases on VM1.

Minimize administrative effort.

What should you do first?

A. Enable a system-assigned managed identity for VM1

B. Register VM1 to the Microsoft.Sql resource provider

C. Install an Azure virtual machine Desired State Configuration (DSC) extension on VM1

D. Register VM1 to the Microsoft.SqlVirtualMachine resource provider

Suggested answer: D

Explanation:

Registering VM1 with the Microsoft.SqlVirtualMachine resource provider installs the SQL Server IaaS Agent extension, which unlocks features such as Automated Patching and Automated Backup, minimizing administrative effort.

Reference: https://docs.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/sql-server-iaas-agent-extension-automate-management

Question 127

You receive numerous alerts from Azure Monitor for an Azure SQL database.

You need to reduce the number of alerts. You must only receive alerts if there is a significant change in usage patterns for an extended period.

Which two actions should you perform? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

A. Set Threshold Sensitivity to High

B. Set the Alert logic threshold to Dynamic

C. Set the Alert logic threshold to Static

D. Set Threshold Sensitivity to Low

E. Set Force Plan to On

Suggested answer: B, D

Explanation:

B: Dynamic Thresholds continuously learn the data of the metric series and try to model it using a set of algorithms and methods. They detect patterns in the data such as seasonality (hourly/daily/weekly) and can handle noisy metrics (such as machine CPU or memory) as well as metrics with low dispersion (such as availability and error rate).

D: Alert threshold sensitivity is a high-level concept that controls the amount of deviation from metric behavior required to trigger an alert. With Low sensitivity, the thresholds are loose, with more distance from the metric series pattern; an alert rule triggers only on large deviations, resulting in fewer alerts.

Incorrect Answers:

A: High - The thresholds will be tight and close to the metric series pattern. An alert rule will be triggered on the smallest deviation, resulting in more alerts.

Reference: https://docs.microsoft.com/en-us/azure/azure-monitor/platform/alerts-dynamic-thresholds

Question 128

You have an Azure SQL database named sqldb1.

You need to minimize the amount of space used by the data and log files of sqldb1.

What should you run?

A. DBCC SHRINKDATABASE

B. sp_clean_db_free_space

C. sp_clean_db_file_free_space

D. DBCC SHRINKFILE
Suggested answer: A

Explanation:

DBCC SHRINKDATABASE shrinks the size of the data and log files in the specified database.

Incorrect Answers:

D: To shrink one data or log file at a time for a specific database, execute the DBCC SHRINKFILE command.
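A short sketch of both commands (the logical log file name and the 128 MB target size are hypothetical):

```sql
-- Shrink all data and log files of the database at once.
DBCC SHRINKDATABASE (N'sqldb1');

-- Or shrink a single file, e.g. the transaction log, to a target
-- size of 128 MB. The logical file name below is hypothetical;
-- query sys.database_files to find the actual name.
DBCC SHRINKFILE (N'sqldb1_log', 128);
```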

Reference: https://docs.microsoft.com/en-us/sql/t-sql/database-console-commands/dbcc-shrinkdatabase-transact-sql

Question 129

You have an Azure SQL Database server named sqlsrv1 that hosts 10 Azure SQL databases.

The databases perform slower than expected.

You need to identify whether the performance issue relates to the use of tempdb by Azure SQL databases on sqlsrv1.

What should you do?

A. Run Query Store-based queries

B. Review information provided by SQL Server Profiler-based traces

C. Review information provided by Query Performance Insight

D. Run dynamic management view-based queries
Suggested answer: D

Explanation:

Dynamic management views such as sys.dm_db_task_space_usage and sys.dm_db_session_space_usage report tempdb space consumption per task and session, so DMV-based queries can attribute tempdb usage to specific workloads. Query Performance Insight surfaces the top resource-consuming queries but does not break down tempdb usage.

Question 130

You have an Azure SQL database named sqldb1.

You need to minimize the possibility of Query Store transitioning to a read-only state.

What should you do?

A. Halve the value of Data Flush Interval.

B. Double the value of Statistics Collection Interval.

C. Halve the value of Statistics Collection Interval.

D. Double the value of Data Flush Interval.

Suggested answer: B

Explanation:

Query Store switches to a read-only state when its maximum storage size is reached. Doubling the Statistics Collection Interval coarsens the aggregation granularity, so Query Store retains fewer runtime statistics intervals, consumes less space, and is less likely to hit the size limit that forces it read-only.
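A sketch of the corresponding setting, assuming the current interval is 30 minutes (Query Store accepts only a fixed set of values for this option, so doubling 30 to 60 keeps it valid):

```sql
-- Double the statistics collection interval from 30 to 60 minutes
-- to reduce the space Query Store consumes.
ALTER DATABASE [sqldb1]
SET QUERY_STORE ( INTERVAL_LENGTH_MINUTES = 60 );
```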