
Salesforce Certified Data Architect Practice Test - Questions Answers, Page 15

Northern Trail Outfitters (NTO) has multiple Salesforce orgs based on geographical regions (AMER, EMEA, APAC). NTO products are created in the AMER org and need to be created in the EMEA and APAC orgs after the products are approved.

Which two features should a data architect recommend to share records between Salesforce orgs? Choose 2 answers.

A. Change Data Capture (CDC)
B. Salesforce Connect
C. Federated Search
D. Salesforce to Salesforce
Suggested answer: A, D

Explanation:

The two features that a data architect should recommend to share records between Salesforce orgs are:

Change Data Capture (CDC): This feature publishes change events for changes to Salesforce records, such as create, update, delete, and undelete operations. External systems or other Salesforce orgs can subscribe to change events using tools such as Platform Events, the Streaming API, or CometD. CDC can help keep data in sync between Salesforce orgs in near real time.
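
To make the event shape concrete, here is a minimal Python sketch of handling a CDC event payload for product replication. The ChangeEventHeader fields (entityName, changeType, recordIds) follow the documented CDC event schema; the subscription transport is omitted, and the replicate_product helper is hypothetical.

```python
# Hedged sketch: route Product2 change events to target orgs.
def handle_change_event(event: dict) -> None:
    header = event["ChangeEventHeader"]
    if header["entityName"] != "Product2":
        return  # only product changes are replicated in this sketch
    if header["changeType"] not in ("CREATE", "UPDATE"):
        return  # ignore deletes/undeletes here
    for record_id in header["recordIds"]:
        replicate_product(record_id, targets=["EMEA", "APAC"])

def replicate_product(record_id: str, targets: list) -> None:
    # Hypothetical helper; a real implementation would upsert the
    # record into the EMEA and APAC orgs via their APIs.
    print(f"Would replicate {record_id} to {targets}")

# Example payload shaped like a CDC change event body:
handle_change_event({
    "ChangeEventHeader": {
        "entityName": "Product2",
        "changeType": "CREATE",
        "recordIds": ["01t5e000003AbCdAAK"],
    },
    "Name": "Trailblazer Tent",
})
```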

Salesforce to Salesforce (S2S): This feature allows sharing records and related data with other Salesforce orgs that you partner with. You can choose which records and fields to share, and which orgs to share them with. You can also accept updates from the other orgs to keep the data consistent.

Salesforce Connect is not a good fit for sharing records between Salesforce orgs because it is intended for integrating external data sources with Salesforce using external objects. Federated Search is not a good fit either, because it is intended for searching across multiple Salesforce orgs without sharing data.

NTO has been using Salesforce for sales and service for 10 years. For the past 2 years, the marketing group has noticed a rise from 0% to 35% in returned mail when sending mail using the contact information stored in Salesforce.

Which solution should the data architect use to reduce the amount of returned mail?

A. Use a third-party data source to update contact information in Salesforce.
B. Email all customers and ask them to verify their information and to call NTO if their address is incorrect.
C. Delete contacts when the mail is returned to save postal costs for NTO.
D. Have the sales team call all existing customers and ask them to verify their contact details.
Suggested answer: A

Explanation:

Using a third-party data source to update contact information in Salesforce is the best solution to reduce the amount of returned mail. This way, the data architect can ensure that the contact information is accurate and up to date without relying on manual verification or deletion of contacts.
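
As an illustration only, here is a minimal Python sketch of pushing cleansed addresses from a third-party service back into Salesforce using simple_salesforce (a third-party client). The cleansed_rows data, record Ids, and credentials are placeholders.

```python
# Hedged sketch: bulk-update contacts with cleansed mailing addresses.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="secret",
                security_token="token")

# Rows returned by a hypothetical address-cleansing service, keyed by
# the Salesforce record Id of each contact.
cleansed_rows = [
    {"Id": "0035e00000AbCdEAAV",
     "MailingStreet": "1 Market St",
     "MailingCity": "San Francisco",
     "MailingPostalCode": "94105"},
]

# The Bulk API keeps API-call consumption low for large cleanses.
results = sf.bulk.Contact.update(cleansed_rows)
print(results)
```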

UC is rolling out its Sales App globally to bring sales teams together on one platform. UC expects millions of opportunities and accounts to be created and is concerned about the performance of the application.

Which 3 recommendations should the data architect make to avoid data skew? Choose 3 answers.

A. Use picklist fields rather than lookups to a custom object.
B. Limit assigning ownership of 10,000 records to one user.
C. Assign 10,000 opportunities to one account.
D. Limit associating 10,000 opportunities with one account.
E. Limit associating 10,000 records looking up to the same record.
Suggested answer: B, D, E

Explanation:

Data skew occurs when a large number of child records are associated with a single parent record, or when a single user owns a large number of records. This can cause performance issues and lock contention. To avoid data skew, the data architect should limit assigning ownership of 10,000 records to one user, limit associating 10,000 opportunities with one account, and limit associating 10,000 records looking up to the same record.
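
For instance, the 10,000-child threshold above can be monitored with an aggregate SOQL query. A minimal sketch using the simple_salesforce Python client follows; credentials are placeholders.

```python
# Hedged sketch: find accounts carrying more than 10,000 opportunities.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="secret",
                security_token="token")

# Aggregate SOQL with a HAVING clause flags skewed parent accounts.
results = sf.query(
    "SELECT AccountId, COUNT(Id) total "
    "FROM Opportunity "
    "GROUP BY AccountId "
    "HAVING COUNT(Id) > 10000"
)
for row in results["records"]:
    print(f"Skewed account {row['AccountId']}: {row['total']} opportunities")
```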

UC has built a B2C ecommerce site on Heroku that shares customer and order data with a Heroku Postgres database. UC is currently utilizing Postgres as the single source of truth for both customers and orders. UC has asked a data architect to replicate the data into Salesforce so that Salesforce can act as the system of record.

What are 3 considerations the data architect should weigh before implementing this requirement? Choose 3 answers.

A. Consider whether the data is required for sales reports, dashboards, and KPIs.
B. Determine if the data is a driver of key processes implemented within Salesforce.
C. Ensure there is a tight relationship between order data and an enterprise resource planning (ERP) application.
D. Ensure the data is CRM-centric and able to populate standard or custom objects.
E. A selection of the tool required to replicate the data.
Suggested answer: B, C, E

Explanation:

Before replicating the data from Heroku Postgres to Salesforce, the data architect should consider the following factors:

Whether the data is a driver of key processes implemented within Salesforce. For example, if the data is used for workflows, triggers, or validation rules, it should be replicated to Salesforce.

Whether there is a tight relationship between order data and an enterprise resource planning (ERP) application. For example, if the order data needs to be synchronized with the ERP system, it should be replicated to Salesforce.

The selection of the tool required to replicate the data. For example, Heroku Connect can be used to bidirectionally sync data between Heroku Postgres and Salesforce.
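
For context, Heroku Connect materializes mapped Salesforce objects as tables in a salesforce schema inside the attached Postgres database. A minimal psycopg2 sketch of reading such a mapping follows; the table and column names assume default mappings, and DATABASE_URL is the standard Heroku config var.

```python
# Hedged sketch: read Heroku Connect-mapped order rows from Postgres.
import os
import psycopg2

conn = psycopg2.connect(os.environ["DATABASE_URL"])
with conn, conn.cursor() as cur:
    # "order" is a reserved word in SQL, hence the quoting.
    cur.execute("""
        SELECT sfid, name, totalamount
        FROM salesforce."order"
        WHERE createddate > now() - interval '1 day'
    """)
    for sfid, name, total in cur.fetchall():
        print(sfid, name, total)
conn.close()
```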

Which 2 data management policies does the data classification feature allow customers to classify in Salesforce? Choose 2 answers.

A. Reference data policy.
B. Data governance policy.
C. Data sensitivity level.
D. Compliance categorization policy.
Suggested answer: C, D

Explanation:

The data classification feature allows customers to classify their data in Salesforce based on two policies:

Data sensitivity level: This policy defines how sensitive the data is and what level of protection it requires. For example, high sensitivity data may require encryption or masking.

Compliance categorization policy: This policy defines how the data is regulated by various laws and standards. For example, GDPR or PCI DSS.
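
Both policies are stored as field metadata: SecurityClassification backs the sensitivity level and ComplianceGroup backs the compliance categorization on FieldDefinition. A minimal simple_salesforce sketch of auditing them follows, assuming FieldDefinition is queryable in your API version; credentials are placeholders.

```python
# Hedged sketch: list classification metadata for Contact fields.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="secret",
                security_token="token")

# FieldDefinition queries must filter on a single entity.
results = sf.query(
    "SELECT QualifiedApiName, SecurityClassification, ComplianceGroup "
    "FROM FieldDefinition "
    "WHERE EntityDefinition.QualifiedApiName = 'Contact'"
)
for field in results["records"]:
    print(field["QualifiedApiName"],
          field["SecurityClassification"],
          field["ComplianceGroup"])
```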

NTO has multiple systems across its enterprise landscape, including Salesforce, with disparate versions of the customer records.

In Salesforce, the customer is represented by the Contact object.

NTO utilizes an MDM solution with these attributes:

1. The MDM solution keeps track of the customer master with a master key.

2. The master key is a map to the record IDs from each external system that customer data is stored within.

3. The MDM solution provides de-duplication features, so it acts as the single source of truth.

How should a data architect implement the storage of the master key within Salesforce?

A. Store the master key in Heroku Postgres and use Heroku Connect for synchronization.
B. Create a custom object to store the master key with a lookup field to Contact.
C. Create an external object to store the master key with a lookup field to Contact.
D. Store the master key on the Contact object as an external ID field (for referential imports).
Suggested answer: D

Explanation:

The best way to implement the storage of the master key within Salesforce is to store it on the Contact object as an external ID field for referential imports. This way, the data architect can use the master key as a unique identifier to match records from different systems and avoid duplicates. The other options are not feasible because they either require additional storage or do not support referential imports.
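
To illustrate the referential-import pattern, here is a minimal simple_salesforce sketch that upserts a contact by master key. MDM_Master_Key__c is a hypothetical external ID field; credentials and values are placeholders.

```python
# Hedged sketch: upsert a contact keyed by the MDM master key.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="secret",
                security_token="token")

# Upsert by external ID: creates the contact if the master key is new,
# updates it if the key already exists -- no Salesforce record Id needed.
sf.Contact.upsert(
    "MDM_Master_Key__c/MK-000042",  # hypothetical external ID field
    {"LastName": "Rivera", "Email": "rivera@example.com"},
)
```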

UC has a large number of orders coming in from its online portal. Historically, all orders have been assigned to a generic user.

Which 2 measures should the data architect recommend to avoid performance issues while working with a large number of order records? Choose 2 answers.

A. Clear the role field on the generic user record.
B. Salesforce handles the assignment of orders automatically, and there is no performance impact.
C. Create a role at the top of the role hierarchy and assign the role to the generic user.
D. Create a pool of generic users and distribute the assignment of orders to the pool of users.
Suggested answer: A, C

Explanation:

Clearing the role field on the generic user record and creating a role at the top of the role hierarchy and assigning it to the generic user are two measures that can help avoid performance issues while working with a large number of order records. These measures can prevent the data skew and lock contention that may occur when a single user owns or shares a large number of records.

NTO has implemented Salesforce for its sales users. Opportunity management in Salesforce is implemented as follows:

1. Sales users enter their opportunities in Salesforce for forecasting and reporting purposes.

2. NTO has a product pricing system (PPS) that is used to update the opportunity amount field on opportunities on a daily basis.

3. PPS is the trusted source within NTO for opportunity amount.

4. NTO uses opportunity forecasts for its sales planning and management.

Sales users have noticed that their updates to the opportunity amount field are overwritten when PPS updates their opportunities.

How should a data architect address this overwriting issue?

A. Create a custom field for opportunity amount that sales users update, separate from the field that PPS updates.
B. Create a custom field for opportunity amount that PPS updates, separate from the field that sales users update.
C. Change opportunity amount field access to read-only for sales users using field-level security.
D. Change the PPS integration to update the opportunity amount field only when the value is NULL.
Suggested answer: C

Explanation:

Changing the opportunity amount field access to read-only for sales users using field-level security is the best way to address the overwriting issue. This way, sales users can still view the opportunity amount field but cannot edit it, and PPS can update it as the trusted source.
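
For verification, field-level security grants are queryable through the standard FieldPermissions object. A minimal simple_salesforce sketch of auditing edit access on Opportunity.Amount follows; credentials are placeholders.

```python
# Hedged sketch: audit which permission sets/profiles can edit Amount.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="secret",
                security_token="token")

results = sf.query(
    "SELECT Parent.Name, PermissionsRead, PermissionsEdit "
    "FROM FieldPermissions "
    "WHERE SobjectType = 'Opportunity' AND Field = 'Opportunity.Amount'"
)
for row in results["records"]:
    print(row["Parent"]["Name"],
          "read" if row["PermissionsRead"] else "-",
          "edit" if row["PermissionsEdit"] else "-")
```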

UC is using Salesforce CRM. UC sales managers are complaining about data quality and would like to monitor and measure data quality.

Which 2 solutions should a data architect recommend to monitor and measure data quality?

Choose 2 answers.

A. Use custom objects and fields to identify issues.
B. Review data quality reports and dashboards.
C. Install and run a data quality analysis dashboard app.
D. Export data and check for data completeness outside of Salesforce.
Suggested answer: B, C

Explanation:

Reviewing data quality reports and dashboards and installing and running a data quality analysis dashboard app are two solutions that can help monitor and measure data quality. Data quality reports and dashboards can provide insights into the completeness, accuracy, and consistency of the data. A data quality analysis dashboard app is a free app from the AppExchange that can help analyze and improve data quality by identifying duplicate, incomplete, or inaccurate records.

UC has multiple Salesforce orgs that are distributed across regional branches. Each branch stores local customer data inside its org's Account and Contact objects. This creates a scenario where UC is unable to view customers across all orgs.

UC has an initiative to create a 360-degree view of the customer, as UC would like to see Account and Contact data from all orgs in one place.

What should a data architect suggest to achieve this 360-degree view of the customer?

A. Consolidate the data from each org into a centralized datastore.
B. Use Salesforce Connect's cross-org adapter.
C. Build a bidirectional integration between all orgs.
D. Use an ETL tool to migrate gap Accounts and Contacts into each org.
Suggested answer: A

Explanation:

Consolidating the data from each org into a centralized datastore is the best suggestion to achieve a 360-degree view of the customer. This way, UC can have a single source of truth for all customer data and avoid data silos and inconsistencies. The other options are not feasible because they either require complex integration, additional cost, or data duplication.
