
Salesforce Certified Data Architect Practice Test - Questions Answers, Page 20


The data architect for UC has written a SOQL query that will return all records from the Task object that have a value in the WhatId field:

Select Id, Description, Subject from Task where WhatId != NULL

When the data architect uses the query to select values for a process, a timeout error occurs.

What does the data architect need to change to make this query more performant?

A. Remove description from the requested field set.
B. Change query to SOSL.
C. Add limit 100 to the query.
D. Change the where clause to filter by a deterministic defined value.
Suggested answer: D

Explanation:

According to the Salesforce documentation, SOQL is a query language that allows querying data from Salesforce objects and fields. SOQL queries have various clauses and operators that can be used to filter and sort the results. However, some clauses and operators can affect the performance of SOQL queries by increasing the cost or complexity of executing them.

To make this query more performant, a data architect should change the where clause to filter by a deterministic defined value (option D). This means using a filter condition that specifies a concrete value or range of values for a field, such as WhatId = '001xx000003DGg3' or WhatId IN ('001xx000003DGg3', '001xx000003DGg4'). This can improve the performance of the query by reducing the number of records that need to be scanned and returned. A deterministic defined value can also leverage an index on the field, which can speed up the query execution.

Removing description from the requested field set (option A) is not a good solution, as it can affect the functionality or usability of the query. The description field may contain important or relevant information that is needed for the process. Changing the query to SOSL (option B) is also not a good solution, as SOSL is a different query language that allows searching text fields across multiple objects. SOSL queries have different syntax and limitations than SOQL queries, and may not return the same results or performance. Adding limit 100 to the query (option C) is also not a good solution, as it can affect the completeness or accuracy of the query. The limit clause specifies the maximum number of records that can be returned by the query, which may not include all the records that match the filter condition.
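As an illustration, the non-selective null comparison can be replaced with a deterministic, indexable filter. A minimal sketch (the Id values below are the hypothetical examples from the explanation above):

```sql
-- Original query: a != NULL comparison on WhatId is non-deterministic
-- and cannot use an index, forcing a full scan of the Task object.
SELECT Id, Description, Subject FROM Task WHERE WhatId != NULL

-- More performant: filter on concrete, indexable values.
SELECT Id, Description, Subject
FROM Task
WHERE WhatId IN ('001xx000003DGg3', '001xx000003DGg4')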

Universal Containers (UC) is in the process of migrating legacy inventory data from an enterprise resource planning (ERP) system into Sales Cloud with the following requirements:

Legacy inventory data will be stored in a custom child object called Inventory__c.

Inventory data should be related to the standard Account object.

The Inventory__c object should inherit the same sharing rules as the Account object.

Anytime an Account record is deleted in Salesforce, the related Inventory__c record(s) should be deleted as well.

What type of relationship field should a data architect recommend in this scenario?

A. Master-detail relationship field on Account, related to Inventory__c
B. Master-detail relationship field on Inventory__c, related to Account
C. Indirect lookup relationship field on Account, related to Inventory__c
D. Lookup relationship field on Inventory__c, related to Account
Suggested answer: B

Explanation:

According to the Salesforce documentation, a relationship field is a field that allows linking one object to another object in Salesforce. There are different types of relationship fields that have different characteristics and behaviors, such as master-detail, lookup, indirect lookup, external lookup, etc.

To recommend a type of relationship field for this scenario, where legacy inventory data will be stored in a custom child object called Inventory__c, inventory data should be related to the standard Account object, the Inventory__c object should inherit the same sharing rules as the Account object, and anytime an Account record is deleted in Salesforce, the related Inventory__c record(s) should be deleted as well, a data architect should recommend:

Master-detail relationship field on Inventory__c, related to Account (option B). This means creating a field on the Inventory__c object that references the Account object as its parent. A master-detail relationship field establishes a parent-child relationship between two objects, where the parent object controls certain behaviors of the child object. For example, a master-detail relationship field can:

Inherit the sharing and security settings from the parent object to the child object. This means that the users who can access and edit the parent record can also access and edit the related child records.

Cascade delete from the parent object to the child object. This means that when a parent record is deleted, all the related child records are also deleted.

Roll up summary fields from the child object to the parent object. This means that the parent object can display aggregated information from the child records, such as count, sum, min, max, or average.

Master-detail relationship field on Account, related to Inventory__c (option A) is not a good solution, as it reverses the direction of the relationship. This means creating a field on the Account object that references the Inventory__c object as its parent. This is not possible, as a standard object cannot be on the detail side of a master-detail relationship.

Indirect lookup relationship field on Account, related to Inventory__c (option C) is also not a good solution, as an indirect lookup is a special type of relationship field that links a custom external object to a standard object using an indirect reference to an external system. This is not applicable for this scenario, as both objects are in Salesforce and do not need an external reference.

Lookup relationship field on Inventory__c, related to Account (option D) is also not a good solution, as it establishes a looser relationship between two objects than a master-detail relationship. A lookup relationship field does not inherit sharing and security settings from the parent object, does not cascade delete from the parent object to the child object, and does not support roll-up summary fields from the child object to the parent object.
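Once the master-detail field exists on Inventory__c, the relationship can be traversed in both directions with SOQL. A sketch, assuming the master-detail field is named Account__c and the child relationship name is Inventories (both are assumptions, not given in the scenario):

```sql
-- Child-to-parent traversal from Inventory__c up to its master Account
SELECT Id, Name, Account__r.Name FROM Inventory__c

-- Parent-to-child subquery from Account down to its detail records
SELECT Id, Name, (SELECT Id, Name FROM Inventories__r) FROM Account
```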

A customer wants to maintain geographic location information including latitude and longitude in a custom object. What would a data architect recommend to satisfy this requirement?

A. Create formula fields with geolocation functions for this requirement.
B. Create custom fields to maintain latitude and longitude information.
C. Create a geolocation custom field to satisfy this requirement.
D. Recommend AppExchange packages to support this requirement.
Suggested answer: C

Explanation:

The correct answer is C: create a geolocation custom field. A geolocation custom field is a compound field that stores both latitude and longitude in a single field, and it supports geolocation functions and distance calculations. Creating formula fields or separate custom fields for latitude and longitude would be inefficient and redundant, and recommending AppExchange packages would not be a direct solution to the requirement.
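A geolocation compound field can be filtered with the built-in DISTANCE and GEOLOCATION functions. A sketch, assuming a custom object Store__c with a geolocation field named Location__c (both names are hypothetical):

```sql
-- Find records within 20 miles of a given point; latitude and longitude
-- components of a compound field are addressed with the __s suffix.
SELECT Name, Location__Latitude__s, Location__Longitude__s
FROM Store__c
WHERE DISTANCE(Location__c, GEOLOCATION(37.775, -122.418), 'mi') < 20
```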

As part of addressing General Data Protection Regulation (GDPR) requirements, UC plans to implement a data classification policy for all its internal systems that store customer information, including Salesforce.

What should a data architect recommend so that UC can easily classify customer information maintained in Salesforce under both standard and custom objects?

A. Use AppExchange products to classify fields based on policy.
B. Use data classification metadata fields available in the field definition.
C. Create a custom picklist field to capture the classification of customer information.
D. Build reports for customer information and validate.
Suggested answer: B

Explanation:

The correct answer is B: use the data classification metadata fields available in the field definition. Data classification metadata fields are standard fields that allow you to classify the sensitivity level of your data based on your organization's policies. You can use them to indicate whether a field contains confidential, restricted, or general data, and they are available for both standard and custom objects in Salesforce. Using AppExchange products, creating a custom picklist field, or building reports would not be as effective or consistent as using the data classification metadata fields.
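The classification values captured this way can be reviewed programmatically through the FieldDefinition object. A minimal sketch (FieldDefinition queries require a filter on the entity, shown here for Account):

```sql
-- Review the classification metadata set on Account fields
SELECT QualifiedApiName, SecurityClassification, ComplianceGroup
FROM FieldDefinition
WHERE EntityDefinition.QualifiedApiName = 'Account'
```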

Northern Trail Outfitters has these simple requirements for a data export process:

File format should be CSV.

Process should be scheduled and run once per week.

The export should be configurable through the Salesforce UI.

Which tool should a data architect leverage to accomplish these requirements?

A. Bulk API
B. Data Export Wizard
C. Third-party ETL tool
D. Data Loader
Suggested answer: B

Explanation:

The correct answer is B: the Data Export Wizard. The Data Export Wizard allows you to export your data in CSV format, schedule the export process to run once per week, and configure the export settings through the Salesforce UI. It can handle up to 51 million records per export. The Bulk API, third-party ETL tools, and Data Loader can also export data, but they are not as simple or user-friendly as the Data Export Wizard for this use case.

UC is planning a massive Salesforce implementation with large volumes of data. As part of the org's implementation, several roles, territories, groups, and sharing rules have been configured. The data architect has been tasked with loading all of the required data, including user data, in a timely manner.

What should a data architect do to minimize data load times due to system calculations?

A. Enable defer sharing calculations, and suspend sharing rule calculations.
B. Load the data through Data Loader, and turn on parallel processing.
C. Leverage the Bulk API and concurrent processing with multiple batches.
D. Enable granular locking to avoid "UNABLE_TO_LOCK_ROW" errors.
Suggested answer: A

Explanation:

The correct answer is A: enable defer sharing calculations, and suspend sharing rule calculations. Defer sharing calculations and suspend sharing rule calculations are features that let you temporarily disable the automatic recalculation of sharing rules while you load large volumes of data. This improves the performance and speed of the data load by avoiding unnecessary system calculations during the load. Loading the data through Data Loader, leveraging the Bulk API, or enabling granular locking can also help with data load times, but they do not directly address the system calculation overhead.

UC has the following systems:

Billing system.

Customer support system.

CRM system.

UC has been having trouble with business intelligence across the different systems. Recently, UC implemented a master data management (MDM) solution that will be the system of truth for customer records.

Which MDM data element is needed to allow reporting across these systems?

A. Global unique customer number.
B. Email address.
C. Phone number.
D. Full name.
Suggested answer: A

Explanation:

The correct answer is A, global unique customer number. A global unique customer number is a data element that can uniquely identify each customer across different systems. It can be used as a key to link customer records from different sources and enable reporting across these systems. Email address, phone number, and full name are not reliable or consistent identifiers for customers, as they can change over time or be shared by multiple customers.
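In Salesforce, such an MDM key is typically modeled as an external ID field so that records from each system can be matched and upserted against it. A sketch, using a hypothetical external ID field name and value:

```sql
-- Hypothetical external ID field carrying the MDM customer number;
-- the same key would exist in the billing and support systems.
SELECT Id, Name FROM Account WHERE Global_Customer_Number__c = 'CUST-000123'
```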

Universal Containers is using Salesforce for opportunity management and an enterprise resource planning (ERP) system for order management. Sales reps do not have access to the ERP and have no visibility into order status.

What solution should a data architect recommend to give the sales team visibility into order status?

A. Leverage Canvas to bring the order management UI into a Salesforce tab.
B. Build batch jobs to push order line items to Salesforce.
C. Leverage Salesforce Connect to bring the order line items from the legacy system to Salesforce.
D. Build real-time integration to pull order line items into Salesforce when viewing orders.
Suggested answer: C

Explanation:

The correct answer is C, leverage Salesforce Connect to bring the order line item from the legacy system to Salesforce. Salesforce Connect is a feature that allows you to integrate external data sources with Salesforce and access them in real time without copying or synchronizing the data. This way, the sales team can view the order status from the ERP system without having access to it. Leveraging Canvas, building batch jobs, or building real-time integration are also possible solutions, but they are more complex and costly than using Salesforce Connect.
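With Salesforce Connect, the ERP order lines surface as an external object that can be queried like any other object. A sketch (the object and field names below are assumptions; external objects carry the __x suffix):

```sql
-- Query ERP order lines in real time via an external object,
-- without copying the data into Salesforce.
SELECT OrderNumber__c, Status__c, Quantity__c FROM Order_Line_Item__x
```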

Universal Containers (UC) is transitioning from Classic to Lightning Experience.

What does UC need to do to ensure users have access to its notes and attachments in Lightning Experience?

A. Add the Notes and Attachments related list to the page layout in Lightning Experience.
B. Manually upload Notes in Lightning Experience.
C. Migrate Notes and Attachments to Enhanced Notes and Files using a migration tool.
D. Manually upload Attachments in Lightning Experience.
Suggested answer: C

Explanation:

The correct answer is C: migrate Notes and Attachments to Enhanced Notes and Files using a migration tool. Enhanced Notes and Files are the Lightning Experience features that replace classic Notes and Attachments, and they offer more functionality and security than the classic versions. To access this content in Lightning Experience, you need to migrate your existing Notes and Attachments using a migration tool provided by Salesforce. Adding the Notes and Attachments related list or manually uploading Notes or Attachments are not valid solutions, as they will not enable the enhanced features in Lightning Experience.

Universal Containers has implemented Sales Cloud to manage patient and related health records. During a recent security audit of the system, it was discovered that some standard and custom fields need to be encrypted.

Which solution should a data architect recommend to encrypt the existing fields?

A. Use the Apex Crypto class to encrypt custom and standard fields.
B. Implement Classic Encryption to encrypt custom and standard fields.
C. Implement Shield Platform Encryption to encrypt custom and standard fields.
D. Export data out of Salesforce and encrypt custom and standard fields.
Suggested answer: C

Explanation:

The correct answer is C: implement Shield Platform Encryption to encrypt standard and custom fields. Shield Platform Encryption is a feature that allows you to encrypt sensitive data at rest in Salesforce without affecting its functionality, and it supports both standard and custom fields. Using the Apex Crypto class, implementing Classic Encryption, or exporting data out of Salesforce are not recommended, as they would either limit functionality, require custom code, or compromise data security.
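To illustrate why option A falls short: the Apex Crypto class requires you to write and maintain custom code and manage your own keys, and the resulting ciphertext is opaque to the platform. A minimal sketch (illustrative only, not the recommended approach):

```apex
// Option A's drawback: encryption via custom Apex code and self-managed keys.
Blob key = Crypto.generateAesKey(256);
Blob encrypted = Crypto.encryptWithManagedIV('AES256', key, Blob.valueOf('sensitive value'));
// Values stored this way are opaque to reports, list views, and search,
// whereas Shield Platform Encryption encrypts at rest while preserving
// most platform functionality.
```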

Total 260 questions