Salesforce Certified Data Architect Practice Test - Questions Answers, Page 22

Universal Containers (UC) is in the process of selling half of its company. As part of this split, UC's main Salesforce org will be divided into two orgs: Org A and Org B. UC has delivered these requirements to its data architect:

1. The data model for Org B will drastically change with different objects, fields, and picklist values.

2. Three million records will need to be migrated from Org A to Org B for compliance reasons.

3. The migration will need to occur within the next two months, prior to the split.

Which migration strategy should a data architect use to successfully migrate the data?

A. Use an ETL tool to orchestrate the migration.
B. Use Data Loader for export and Data Import Wizard for import.
C. Write a script to use the Bulk API.
D. Use the Salesforce CLI to query, export, and import.
Suggested answer: A

Explanation:

Using an ETL tool to orchestrate the migration is the best strategy for this scenario, as it can handle the data model changes, the large volume of records, and the tight timeline. Writing a script to use the Bulk API (option C) is also possible, but it would require more coding and testing effort. Using Data Loader and Data Import Wizard (option B) is not suitable for migrating three million records, as they have limitations on the batch size and the number of records per operation. Using the Salesforce CLI (option D) is also not recommended for large data migrations, as it is mainly designed for development and testing purposes.
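
For illustration, option C would involve a script along these lines: a minimal sketch of a Bulk API 2.0 ingest job in Python using the requests library. The access token, instance URL, and accounts.csv file are hypothetical placeholders, and the CSV is assumed to be already transformed to Org B's new data model.

```python
# Minimal sketch: load one CSV of migrated records into Org B via Bulk API 2.0.
# ACCESS_TOKEN, INSTANCE_URL, and accounts.csv are hypothetical placeholders.
import requests

ACCESS_TOKEN = "00D...hypothetical"
INSTANCE_URL = "https://orgb.my.salesforce.com"
API = f"{INSTANCE_URL}/services/data/v58.0/jobs/ingest"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# 1. Create an ingest job for the target object.
job = requests.post(
    API,
    headers={**HEADERS, "Content-Type": "application/json"},
    json={"object": "Account", "operation": "insert", "contentType": "CSV"},
).json()

# 2. Upload the CSV data (already mapped to Org B's objects and fields).
with open("accounts.csv", "rb") as f:
    requests.put(
        f"{API}/{job['id']}/batches",
        headers={**HEADERS, "Content-Type": "text/csv"},
        data=f.read(),
    )

# 3. Close the job so Salesforce starts processing the uploaded batches.
requests.patch(
    f"{API}/{job['id']}",
    headers={**HEADERS, "Content-Type": "application/json"},
    json={"state": "UploadComplete"},
)
print("Job", job["id"], "submitted")
```

An ETL tool (option A) would wrap the same extract-transform-load steps with scheduling, mapping, and error handling out of the box, which is why it wins on a two-month timeline.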

Northern Trail Outfitters (NTO) is in the process of evaluating big objects to store large amounts of asset data from an external system. NTO will need to report on this asset data weekly.

Which two native tools should a data architect recommend to achieve this reporting requirement?

A. Standard reports and dashboards
B. Async SOQL with a custom object
C. Standard SOQL queries
D. Einstein Analytics
Suggested answer: B, D

Explanation:

Async SOQL with a custom object (option B) and Einstein Analytics (option D) are the two native tools that can be used to report on big object data. Async SOQL allows querying big object data and storing the results in a custom object, which can then be used for reporting. Einstein Analytics can connect to big object data sources and provide advanced analytics and visualization features. Standard reports and dashboards (option A) and standard SOQL queries (option C) cannot be used to report on big object data, as they do not support big object fields.
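
As a rough illustration of option B, the sketch below submits an Async SOQL job over REST that copies big object rows into a reportable custom object. The payload field names (query, operation, targetObject, targetFieldMap) follow the Async SOQL pilot documentation as best recalled and should be verified; all object and field names here are hypothetical.

```python
# Rough sketch: submit an Async SOQL job that writes big object rows into a
# custom object for weekly reporting. Payload shape per the Async SOQL docs
# (verify before use); credentials and names are hypothetical placeholders.
import requests

ACCESS_TOKEN = "00D...hypothetical"
INSTANCE_URL = "https://nto.my.salesforce.com"

resp = requests.post(
    f"{INSTANCE_URL}/services/data/v45.0/async-queries/",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
             "Content-Type": "application/json"},
    json={
        # Query the big object and insert the results into a custom object.
        "query": "SELECT Asset_Id__c, Status__c FROM Asset_History__b",
        "operation": "insert",
        "targetObject": "Asset_Report__c",
        "targetFieldMap": {
            "Asset_Id__c": "Asset_Id__c",
            "Status__c": "Status__c",
        },
    },
)
print(resp.status_code, resp.json())
```

Once the results land in Asset_Report__c, standard reports or Einstein Analytics can consume them like any other custom object data.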

UC is preparing to implement Sales Cloud and would like its users to have read-only access to an account record if they have access to its child opportunity record. How would a data architect implement this sharing requirement between objects?

A. Create a criteria-based sharing rule.
B. Implicit sharing will automatically handle this with standard functionality.
C. Add appropriate users to the account team.
D. Create an owner-based sharing rule.
Suggested answer: B

Explanation:

Implicit sharing will automatically handle this sharing requirement with standard functionality, as it grants read-only access to parent accounts when users have access to child opportunities. This is also known as account-opportunity implicit sharing. Creating a criteria-based sharing rule (option A) or an owner-based sharing rule (option D) is not necessary, as those are used to grant additional access based on record criteria or ownership. Adding appropriate users to the account team (option C) is also not required, as account teams are used to grant access to specific users or groups for individual accounts.
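
If you want to confirm this behavior in an org, implicit sharing can be observed through the UserRecordAccess object. Below is a minimal Python sketch using the simple-salesforce package; the credentials and record IDs are hypothetical placeholders.

```python
# Minimal sketch: verify implicit account sharing by checking a user's access
# to a parent account via UserRecordAccess. Requires simple-salesforce;
# credentials and IDs are hypothetical placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(username="admin@uc.example", password="pw",
                security_token="token")

USER_ID = "005000000000001AAA"     # opportunity team member (hypothetical)
ACCOUNT_ID = "001000000000001AAA"  # parent account of their opportunity

result = sf.query(
    "SELECT RecordId, HasReadAccess, HasEditAccess "
    "FROM UserRecordAccess "
    f"WHERE UserId = '{USER_ID}' AND RecordId = '{ACCOUNT_ID}'"
)
for row in result["records"]:
    # Implicit sharing should yield read access but not edit access.
    print(row["RecordId"], "read:", row["HasReadAccess"],
          "edit:", row["HasEditAccess"])
```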

Universal Containers has a Sales Cloud implementation for its sales team and an enterprise resource planning (ERP) system as its customer master. The sales team is complaining about duplicate accounts and data quality issues with account data.

Which two solutions should a data architect recommend to resolve the complaints?

A. Build a nightly batch job to de-dupe data, and merge account records.
B. Integrate Salesforce with the ERP, and make the ERP the system of truth.
C. Build a nightly sync job from the ERP to Salesforce.
D. Implement a de-dupe solution and establish account ownership in Salesforce.
Suggested answer: B, D

Explanation:

Integrating Salesforce with ERP and making ERP the system of truth (option B) and implementing a de-dupe solution and establishing account ownership in Salesforce (option D) are the two solutions that a data architect should recommend to resolve the complaints. Option B ensures that account data is consistent and accurate across both systems, while option D prevents duplicate records and clarifies who owns each account in Salesforce. Building a nightly batch job to de-dupe data and merge account records (option A) is not a good solution, as it does not address the root cause of the duplication and may result in data loss or conflicts. Building a nightly sync job from ERP to Salesforce (option C) is also not sufficient, as it does not prevent duplication or establish ownership in Salesforce.
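
As a sketch of how option B might be wired up, the snippet below pushes ERP account data into Salesforce keyed on an external ID, so repeated syncs update existing accounts instead of creating duplicates. The ERP_Id__c external ID field and the credentials are hypothetical assumptions.

```python
# Minimal sketch of option B: sync accounts from the ERP (system of truth)
# into Salesforce, keyed on an external ID so re-runs never duplicate.
# ERP_Id__c is a hypothetical external ID field; credentials are placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(username="integration@uc.example", password="pw",
                security_token="token")

# Records as exported from the ERP.
erp_accounts = [
    {"erp_id": "A-1001", "name": "Acme Corp", "phone": "555-0100"},
    {"erp_id": "A-1002", "name": "Globex", "phone": "555-0200"},
]

for acct in erp_accounts:
    # Upsert on the external ID: creates the account if new, updates it if it
    # already exists, so the sync itself cannot introduce duplicates.
    sf.Account.upsert(
        f"ERP_Id__c/{acct['erp_id']}",
        {"Name": acct["name"], "Phone": acct["phone"]},
    )
```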

Northern Trail Outfitters (NTO) has a variety of customers that include households, businesses, and individuals.

The following conditions exist within its system:

NTO has a total of five million customers.

Duplicate records exist and are replicated across many systems, including Salesforce.

Given these conditions, there is a lack of consistent presentation and clear identification of a customer record.

Which three options should a data architect pursue to resolve the issues with the customer data?

A. Create a unique global customer ID for each customer and store it in all systems for referential identity.
B. Use Salesforce CDC to sync customer data across all systems to keep customer records in sync.
C. Invest in a data deduplication tool to de-dupe and merge duplicate records across all systems.
D. Duplicate customer records across the systems and provide a two-way sync of data between the systems.
E. Create a customer master database external to Salesforce as a system of truth and sync the customer data with all systems.
Suggested answer: A, C, E

Explanation:

Creating a unique global customer ID for each customer and storing that in all systems for referential identity (option A), investing in a data duplicate tool to de-dupe and merge duplicate records across all systems (option C), and creating a customer master database external to Salesforce as a system of truth and syncing the customer data with all systems (option E) are the three options that a data architect should perform to resolve the issues with the customer data. Option A ensures that each customer can be uniquely identified across different systems, option C eliminates duplicate records and improves data quality, and option E provides a consistent and reliable source of customer data for all systems. Using Salesforce CDC to sync customer data across all systems (option B) is not a good option, as it does not address the duplication or inconsistency issues. Duplicating customer records across the system and providing a two-way sync of data between the systems (option D) is also not a good option, as it may create more confusion and conflicts with customer data.
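
As a small illustration of option A, the sketch below mints a global customer ID and attaches it to a customer record before it is propagated to each system. The field and naming choices are hypothetical.

```python
# Illustrative sketch of option A: mint one global customer ID and stamp it
# on every system's copy of the record for referential identity.
import uuid

def mint_global_customer_id() -> str:
    # A UUID is one simple way to guarantee uniqueness across systems.
    return str(uuid.uuid4())

customer = {"name": "Jane Doe", "email": "jane@example.com"}
customer["global_customer_id"] = mint_global_customer_id()

# Each downstream system (Salesforce, ERP, marketing platform, ...) stores the
# same key, e.g. in a Global_Customer_ID__c external ID field in Salesforce.
print(customer["global_customer_id"])
```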

Universal Containers (UC) is implementing Salesforce and will be using Salesforce to track customer complaints, provide white papers on products, and provide subscription-based support.

Which license type will UC users need to fulfill UC's requirements?

A. Sales Cloud License
B. Lightning Platform Starter License
C. Service Cloud License
D. Salesforce License
Suggested answer: C

Explanation:

Service Cloud License (option C) is the license type that UC users need to fulfill UC's requirements, as it allows them to track customer complaints, provide white papers on products, and provide subscription-based support. Sales Cloud License (option A) is mainly for managing sales processes and leads, Lightning Platform Starter License (option B) is for building custom apps and workflows, and Salesforce License (option D) is a generic term that does not specify a particular license type.

A large automobile company has implemented Salesforce for its sales associates. Leads flow from its website into Salesforce through a batch integration, and the batch job converts the leads to Accounts. Customers visiting its retail stores are also created in Salesforce as Accounts.

The company has noticed a large number of duplicate Accounts in Salesforce. On analysis, it was found that certain customers could interact with its website and also visit the store. The sales associates use Global Search to look for customers in Salesforce before creating new ones.

Which option should a data architect implement to avoid duplicates?

A. Leverage duplicate rules in Salesforce to validate duplicates during the account creation process.
B. Develop an Apex class that searches for duplicates and removes them nightly.
C. Implement an MDM solution to validate customer information before records are created in Salesforce.
D. Build custom search functionality that allows sales associates to search for customers in real time when they visit a retail store.
Suggested answer: A

Explanation:

Leveraging duplicate rules in Salesforce to validate duplicates during the account creation process (option A) is the best option to avoid duplicates, as it allows the sales associates to identify and merge duplicate accounts before they are saved. Developing an Apex class that searches for duplicates and removes them nightly (option B) is not a good option, as it may cause data loss or conflicts, and it does not prevent duplicates from being created in the first place. Implementing an MDM solution to validate customer information before records are created in Salesforce (option C) is also not a good option, as it introduces additional complexity and cost, and it does not address the issue of customers interacting with both the website and the store. Building custom search functionality for sales associates to search for customers in real time (option D) is also not a good option, as it may not be reliable or user-friendly, and it does not improve on the existing Global Search feature.
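
For a sense of how option A behaves at the API level, the sketch below attempts to create an account and handles the DUPLICATES_DETECTED error that a blocking duplicate rule returns, which applies to both UI creation and the batch lead-conversion path. It uses the simple-salesforce package; credentials are hypothetical placeholders.

```python
# Minimal sketch: a blocking duplicate rule causes record creation to fail
# with errorCode DUPLICATES_DETECTED, which the caller can handle instead of
# saving a duplicate. Credentials are hypothetical placeholders.
from simple_salesforce import Salesforce
from simple_salesforce.exceptions import SalesforceMalformedRequest

sf = Salesforce(username="associate@auto.example", password="pw",
                security_token="token")

try:
    sf.Account.create({"Name": "Jane Doe", "Phone": "555-0100"})
except SalesforceMalformedRequest as e:
    # e.content holds the parsed error payload returned by Salesforce.
    for err in e.content:
        if err.get("errorCode") == "DUPLICATES_DETECTED":
            print("Duplicate blocked by rule:", err.get("message"))
```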

Northern Trail Outfitters (NTO) has implemented Salesforce for its sales users. Opportunity management in Salesforce is implemented as follows:

1. Sales users enter their opportunities in Salesforce for forecasting and reporting purposes.

2. NTO has a product pricing system (PPS) that is used to update the Opportunity Amount field on opportunities on a daily basis.

3. PPS is the trusted source within NTO for Opportunity Amount.

4. NTO uses Opportunity Forecast for its sales planning and management.

Sales users have noticed that their updates to the Opportunity Amount field are overwritten when PPS updates their opportunities.

How should a data architect address this overwriting issue?

A. Create a custom field for Opportunity Amount that PPS updates, separate from the field sales users update.
B. Change the PPS integration to update the Opportunity Amount field only when the value is null.
C. Change Opportunity Amount field access to Read Only for sales users via field-level security.
D. Create a custom field for Opportunity Amount that sales users update, separate from the field that PPS updates.
Suggested answer: C

Explanation:

Changing Opportunity Amount field access to Read Only for sales users via field-level security (option C) is the best way to address the overwriting issue, as it prevents sales users from updating the field that is controlled by PPS, and ensures data consistency and accuracy. Creating a custom field for Opportunity Amount that PPS updates, separate from the field sales users update (option A), or creating a custom field that sales users update, separate from the field PPS updates (option D), are not good solutions, as they may create confusion and inconsistency with the Opportunity Forecast feature. Changing the PPS integration to update the Opportunity Amount field only when the value is null (option B) is also not a good solution, as it may cause data loss or conflicts with the sales users' inputs.

Universal Containers (UC) has a Salesforce org with multiple automated processes defined for group membership processing. UC also has multiple admins on staff who perform manual adjustments to the role hierarchy. The automated tasks and manual tasks overlap daily, and UC is consistently experiencing 'lock errors'.

What should a data architect recommend to mitigate these errors?

A. Enable granular locking.
B. Remove SOQL statements from Apex loops.
C. Enable sharing recalculations.
D. Ask Salesforce Support for additional CPU power.
Suggested answer: A

Explanation:

Enabling granular locking (option A) is the best recommendation to mitigate these errors, as it allows finer control over how records are locked during automated or manual processes, and reduces the chances of lock contention or deadlock. Removing SOQL statements from Apex Loops (option B) is a good practice for improving performance and avoiding governor limits, but it does not directly address the lock errors issue. Enabling sharing recalculations (option C) is also not relevant for this issue, as it is used to update sharing rules and recalculate access for records. Asking Salesforce support for additional CPU power (option D) is also not a viable solution, as it does not solve the root cause of the lock errors.

Universal Containers (UC) needs to run monthly and yearly reports on opportunities and orders for sales reporting. There are 5 million opportunities and 10 million orders. Sales users are complaining that the reports regularly time out.

What is the fastest and most effective way for a data architect to solve the time-out issue?

A. Create custom fields on Opportunity, copy data from Order into those custom fields, and run all reports on the Opportunity object.
B. Extract opportunity and order data from Salesforce, and use a third-party reporting tool to run reports outside of Salesforce.
C. Create a skinny table in Salesforce, copy order and opportunity fields into the skinny table, and create the required reports on it.
D. Create an aggregate custom object that summarizes the monthly and yearly values into the required format for the required reports.
Suggested answer: D

Explanation:

Creating an aggregate custom object that summarizes the monthly and yearly values into the required format for the required reports (option D) is the fastest and most effective way for a data architect to solve the time-out issue, as it reduces the amount of data that the reports must query and process. Creating custom fields on Opportunity and copying data from Order into those custom fields (option A) is not a good solution, as it creates data redundancy and inconsistency, and it does not address the large volume of data. Extracting opportunity and order data from Salesforce and using a third-party reporting tool (option B) is also not a good solution, as it introduces additional complexity and cost, and it does not leverage the native reporting features of Salesforce. Creating a skinny table (option C) is also not a good solution: skinny tables must be provisioned by Salesforce Support, contain columns from a single object (so they cannot combine order and opportunity fields), and do not reduce the number of records.
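
As a sketch of option D, the snippet below could run on a schedule: it rolls up opportunity amounts by month with an aggregate SOQL query and upserts the results into a summary custom object, so reports read a few hundred summary rows instead of millions of detail rows. The Sales_Summary__c object, its fields, and the credentials are hypothetical assumptions.

```python
# Minimal sketch of option D: monthly roll-up of Opportunity amounts into a
# hypothetical Sales_Summary__c custom object, keyed on a Period__c external
# ID so re-runs update in place. Credentials are placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(username="integration@uc.example", password="pw",
                security_token="token")

rows = sf.query_all(
    "SELECT CALENDAR_YEAR(CloseDate) yr, CALENDAR_MONTH(CloseDate) mo, "
    "SUM(Amount) total FROM Opportunity WHERE IsClosed = true "
    "GROUP BY CALENDAR_YEAR(CloseDate), CALENDAR_MONTH(CloseDate)"
)["records"]

for r in rows:
    period = f"{r['yr']}-{r['mo']:02d}"  # e.g. '2024-07'
    # Upsert on the Period__c external ID: one summary row per month.
    sf.Sales_Summary__c.upsert(
        f"Period__c/{period}",
        {"Total_Amount__c": r["total"]},
    )
```

A similar roll-up over Order records would feed the same summary object, and yearly figures can be derived from the monthly rows.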
