Salesforce Certified Data Architect Practice Test - Questions Answers, Page 15
Question 141

Northern Trail Outfitters (NTO) has multiple Salesforce orgs based on geographical regions (AMER, EMEA, APAC). NTO products are mastered in the AMER org and need to be created in the EMEA and APAC orgs after the products are approved.
Which two features should a data architect recommend to share records between Salesforce orgs? Choose 2 answers.
Explanation:
The two features that a data architect should recommend to share records between Salesforce orgs are:
Change Data Capture (CDC): This feature publishes change events for changes to Salesforce records, such as create, update, delete, and undelete operations. External systems or other Salesforce orgs can subscribe to change events using tools such as Platform Events, the Streaming API, or CometD. CDC can help keep data in sync between Salesforce orgs in near real time.
Salesforce to Salesforce (S2S): This feature allows sharing records and related data with other Salesforce orgs that you partner with. You can choose which records and fields to share, and which orgs to share with. You can also accept updates from the other orgs to keep the data consistent.
Salesforce Connect is not a good fit for sharing records between Salesforce orgs because it is intended for integrating external data sources with Salesforce using external objects. Federated Search is not a good fit because it is intended for searching across multiple Salesforce orgs without sharing data.
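As an illustration of the record flow these features automate, here is a minimal sketch of copying approved products from the AMER org to the EMEA org using the open-source simple-salesforce Python library. The credentials, the Approved__c flag, and the AMER_Product_Id__c external ID field are assumptions for this example, not part of the question:

```python
from simple_salesforce import Salesforce

# Connect to both orgs (credentials are placeholders for this sketch).
amer = Salesforce(username="admin@nto.amer", password="...", security_token="...")
emea = Salesforce(username="admin@nto.emea", password="...", security_token="...")

# Pull approved products from AMER; Approved__c is a hypothetical custom flag.
approved = amer.query_all(
    "SELECT Id, Name, ProductCode FROM Product2 WHERE Approved__c = true"
)["records"]

for product in approved:
    # Upsert into EMEA keyed on a hypothetical external ID field that stores
    # the AMER record ID, so reruns update rather than duplicate.
    emea.Product2.upsert(
        f"AMER_Product_Id__c/{product['Id']}",
        {"Name": product["Name"], "ProductCode": product["ProductCode"]},
    )
```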
Question 142

NTO has been using Salesforce for sales and service for 10 years. For the past 2 years, the marketing group has noticed a rise from 0% to 35% in returned mail when sending mail using the contact information stored in Salesforce.
Which solution should the data architect use to reduce the amount of returned mail?
Explanation:
Using a third-party data source to update contact information in Salesforce is the best solution to reduce the amount of returned mail. This way, the data architect can ensure that the contact information is accurate and up to date without relying on manual verification or deletion of contacts.
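To make the approach concrete, here is a minimal sketch that refreshes contact addresses from a third-party source and pushes the corrections back with the Bulk API. It assumes simple-salesforce, and fetch_verified_address is a hypothetical stand-in for the vendor's API:

```python
from simple_salesforce import Salesforce

sf = Salesforce(username="...", password="...", security_token="...")

def fetch_verified_address(contact):
    # Hypothetical call to an address-verification vendor; returns a dict of
    # corrected fields (e.g. MailingStreet, MailingCity) or None if current.
    return None

contacts = sf.query_all(
    "SELECT Id, MailingStreet, MailingCity, MailingPostalCode FROM Contact"
)["records"]

updates = []
for contact in contacts:
    corrected = fetch_verified_address(contact)
    if corrected:
        updates.append({"Id": contact["Id"], **corrected})

# The Bulk API keeps large contact volumes efficient.
if updates:
    sf.bulk.Contact.update(updates, batch_size=10000)
```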
Question 143

UC is rolling out a Sales App globally to bring sales teams together on one platform. UC expects millions of opportunities and accounts to be created and is concerned about the performance of the application.
Which 3 recommendations should the data architect make to avoid data skew? Choose 3 answers.
Explanation:
Data skew occurs when a large number of child records are associated with a single parent record, or when a single user owns a large number of records. This can cause performance issues and lock contention. To avoid data skew, the data architect should limit any single user's record ownership to 10,000 records, limit the number of opportunities associated with a single account to 10,000, and limit the number of records looking up to the same record to 10,000.
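For instance, account skew can be detected with a SOQL aggregate query before it becomes a problem; a minimal sketch, assuming simple-salesforce:

```python
from simple_salesforce import Salesforce

sf = Salesforce(username="...", password="...", security_token="...")

# Count opportunities per account and flag parents nearing the 10,000-child
# skew threshold.
result = sf.query_all(
    "SELECT AccountId, COUNT(Id) cnt FROM Opportunity "
    "GROUP BY AccountId HAVING COUNT(Id) > 9000"
)

for row in result["records"]:
    print(f"Account {row['AccountId']} has {row['cnt']} opportunities")
```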
Question 144

UC has built a B2C ecommerce site on Heroku that shares customer and order data with a Heroku Postgres database. UC is currently utilizing Postgres as the single source of truth for both customers and orders. UC has asked a data architect to replicate the data into Salesforce so that Salesforce can act as the system of record.
Which 3 considerations should the data architect weigh before implementing this requirement? Choose 3 answers:
Explanation:
Before replicating the data from Heroku Postgres to Salesforce, the data architect should consider the following factors:
Whether the data is a driver of key processes implemented within Salesforce. For example, if the data is used for workflows, triggers, or validation rules, it should be replicated to Salesforce.
Whether there is a tight relationship between order data and an enterprise resource planning (ERP) application. For example, if the order data needs to be synchronized with the ERP system, it should be replicated to Salesforce.
The selection of the tool required to replicate the data. For example, Heroku Connect can be used to bidirectionally sync data between Heroku Postgres and Salesforce.
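As background for the tool-selection consideration: Heroku Connect maps Salesforce objects to Postgres tables (by default in a salesforce schema), so the ecommerce site keeps reading plain SQL while the sync runs. A minimal sketch, assuming psycopg2 and a configured Contact mapping:

```python
import os
import psycopg2

# DATABASE_URL is the standard Heroku Postgres config var.
conn = psycopg2.connect(os.environ["DATABASE_URL"])

with conn, conn.cursor() as cur:
    # Heroku Connect materializes mapped objects as tables in the
    # "salesforce" schema; sfid holds the Salesforce record ID.
    cur.execute("SELECT sfid, email FROM salesforce.contact LIMIT 10")
    for sfid, email in cur.fetchall():
        print(sfid, email)
```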
Question 145

Which 2 data management policies does the data classification feature allow customers to apply in Salesforce? Choose 2 answers:
Explanation:
The data classification feature allows customers to classify their data in Salesforce based on two policies:
Data sensitivity level: This policy defines how sensitive the data is and what level of protection it requires. For example, high sensitivity data may require encryption or masking.
Compliance categorization policy: This policy defines how the data is regulated by various laws and standards. For example, GDPR or PCI DSS.
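These classifications are stored as metadata on each field and can be reviewed programmatically. A minimal sketch (assuming simple-salesforce) that reads the data classification attributes from the Tooling API's FieldDefinition object for the Contact object:

```python
from simple_salesforce import Salesforce

sf = Salesforce(username="...", password="...", security_token="...")

soql = (
    "SELECT QualifiedApiName, SecurityClassification, ComplianceGroup "
    "FROM FieldDefinition "
    "WHERE EntityDefinition.QualifiedApiName = 'Contact'"
)

# FieldDefinition is exposed through the Tooling API.
result = sf.restful("tooling/query", params={"q": soql})

for field in result["records"]:
    print(field["QualifiedApiName"],
          field["SecurityClassification"],
          field["ComplianceGroup"])
```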
Question 146

NTO has multiple systems across its enterprise landscape, including Salesforce, with disparate versions of the customer records.
In Salesforce, the customer is represented by the Contact object.
NTO utilizes an MDM solution with these attributes:
1. The MDM solution keeps track of the customer master with a master key.
2. The master key is a map to the record IDs from each external system that customer data is stored within.
3. The MDM solution provides de-duplication features, so it acts as the single source of truth.
How should a data architect implement the storage of the master key within Salesforce?
Explanation:
The best way to implement the storage of the master key within Salesforce is to store it on the Contact object as an external ID field for referential imports. This way, the data architect can use the master key as a unique identifier to match records from different systems and avoid duplicates. The other options are not feasible because they either require additional storage or do not support referential imports.
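Concretely, with the master key in an external ID field, imports can upsert on it directly and never need Salesforce record IDs. A minimal sketch, assuming simple-salesforce and a hypothetical MDM_Master_Key__c external ID field:

```python
from simple_salesforce import Salesforce

sf = Salesforce(username="...", password="...", security_token="...")

# A record as it might arrive from the MDM hub; the master key is the
# cross-system identifier, not a Salesforce ID.
mdm_record = {
    "master_key": "MDM-000042",
    "FirstName": "Ada",
    "LastName": "Lovelace",
    "Email": "ada@example.com",
}

# Upsert on the external ID field: creates the contact if the master key
# is new, updates it if the key already exists -- no duplicates either way.
sf.Contact.upsert(
    f"MDM_Master_Key__c/{mdm_record['master_key']}",
    {k: v for k, v in mdm_record.items() if k != "master_key"},
)
```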
Question 147

UC has a large number of orders coming in from its online portal. Historically, all orders are assigned to a generic user.
Which 2 measures should the data architect recommend to avoid performance issues while working with a large number of order records? Choose 2 answers:
Explanation:
Clearing the role field in the generic user record, or creating a role at the top of the role hierarchy and assigning it to the generic user, are two measures that can help avoid performance issues while working with a large number of order records. These measures prevent the ownership skew and lock contention that can occur when a single user low in the role hierarchy owns a large number of records, because sharing recalculations no longer have to cascade up through every parent role.
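Mirroring the account-skew check from Question 143, ownership skew can be spotted with an aggregate query over owners; a minimal sketch, assuming simple-salesforce:

```python
from simple_salesforce import Salesforce

sf = Salesforce(username="...", password="...", security_token="...")

# Find owners holding an outsized share of orders (ownership skew).
result = sf.query_all(
    "SELECT OwnerId, COUNT(Id) cnt FROM Order "
    "GROUP BY OwnerId HAVING COUNT(Id) > 9000"
)

for row in result["records"]:
    print(f"User {row['OwnerId']} owns {row['cnt']} orders")
```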
Question 148

NTO has implemented Salesforce for its sales users. Opportunity management in Salesforce is implemented as follows:
1. Sales users enter their opportunities in Salesforce for forecasting and reporting purposes.
2. NTO has a product pricing system (PPS) that is used to update the opportunity amount field on opportunities on a daily basis.
3. PPS is the trusted source within NTO for opportunity amount.
4. NTO uses opportunity forecasts for its sales planning and management.
Sales users have noticed that their updates to the opportunity amount field are overwritten when PPS updates their opportunities.
How should a data architect address this overwriting issue?
Explanation:
Changing the opportunity amount field's access to read-only for sales users using field-level security is the best way to address the overwriting issue. This way, sales users can still view the opportunity amount field but cannot edit it, and PPS can continue to update it as the trusted source.
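Field-level security is normally adjusted in Setup, but access can also be audited and changed through the API's FieldPermissions object. A minimal sketch assuming simple-salesforce; in practice you would filter ParentId to the permission sets or profiles used by sales, which this sketch leaves as an exercise:

```python
from simple_salesforce import Salesforce

sf = Salesforce(username="...", password="...", security_token="...")

# Each FieldPermissions row grants read/edit on one field to one
# permission set (profiles are represented by profile-owned permission sets).
perms = sf.query_all(
    "SELECT Id, ParentId, PermissionsEdit FROM FieldPermissions "
    "WHERE SobjectType = 'Opportunity' AND Field = 'Opportunity.Amount'"
)["records"]

for perm in perms:
    # Filter perm["ParentId"] to the sales permission sets before flipping
    # access in a real org; this sketch updates every grant it finds.
    if perm["PermissionsEdit"]:
        sf.FieldPermissions.update(
            perm["Id"], {"PermissionsEdit": False, "PermissionsRead": True}
        )
```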
Question 149

UC is using Salesforce CRM. UC sales managers are complaining about data quality and would like to monitor and measure data quality.
Which 2 solutions should a data architect recommend to monitor and measure data quality? Choose 2 answers.
Explanation:
Reviewing data quality reports and dashboards and installing and running a data quality analysis dashboard app are two solutions that can help monitor and measure data quality. Data quality reports and dashboards provide insight into the completeness, accuracy, and consistency of the data. The Data Quality Analysis Dashboards app is a free app from the AppExchange that helps analyze and improve data quality by identifying duplicate, incomplete, or inaccurate records.
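A common building block behind such reports is a field-completeness measure. A minimal sketch (assuming simple-salesforce) that scores how completely key contact fields are populated:

```python
from simple_salesforce import Salesforce

sf = Salesforce(username="...", password="...", security_token="...")

KEY_FIELDS = ["Email", "Phone", "MailingStreet", "Title"]

records = sf.query_all(
    "SELECT " + ", ".join(KEY_FIELDS) + " FROM Contact"
)["records"]

total = len(records)
for field in KEY_FIELDS:
    filled = sum(1 for record in records if record.get(field))
    pct = filled / total if total else 0.0
    print(f"{field}: {pct:.1%} complete")
```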
Question 150

UC has multiple Salesforce orgs that are distributed across regional branches. Each branch stores local customer data inside its org's Account and Contact objects. This creates a scenario where UC is unable to view customers across all orgs.
UC has an initiative to create a 360-degree view of the customer, as UC would like to see Account and Contact data from all orgs in one place.
What should a data architect suggest to achieve this 360-degree view of the customer?
Explanation:
Consolidating the data from each org into a centralized datastore is the best suggestion to achieve a 360-degree view of the customer. This way, UC can have a single source of truth for all customer data and avoid data silos and inconsistencies. The other options are not feasible because they either require complex integration, additional cost, or data duplication.
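To illustrate the consolidation pattern, here is a minimal sketch that copies accounts from several orgs into a single local SQLite table, tagging each row with its source org. The org credentials are placeholders, and in practice the centralized datastore would more likely be a data warehouse or an MDM/Customer 360 style platform:

```python
import sqlite3
from simple_salesforce import Salesforce

ORGS = {
    "AMER": {"username": "...", "password": "...", "security_token": "..."},
    "EMEA": {"username": "...", "password": "...", "security_token": "..."},
}

db = sqlite3.connect("customer360.db")
db.execute(
    "CREATE TABLE IF NOT EXISTS account "
    "(source_org TEXT, sfid TEXT, name TEXT, PRIMARY KEY (source_org, sfid))"
)

for org_name, creds in ORGS.items():
    sf = Salesforce(**creds)
    for rec in sf.query_all("SELECT Id, Name FROM Account")["records"]:
        db.execute(
            "INSERT OR REPLACE INTO account VALUES (?, ?, ?)",
            (org_name, rec["Id"], rec["Name"]),
        )

db.commit()
```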