ExamGecko

Salesforce Certified Data Architect Practice Test - Questions Answers, Page 6

Get Cloudy Consulting is migrating their legacy system's users and data to Salesforce. They will be creating 15,000 users, 1.5 million Account records, and 15 million Invoice records. The visibility of these records is controlled by 50 owner-based and criteria-based sharing rules.

Get Cloudy Consulting needs to minimize data loading time during this migration to a new organization.

Which two approaches will accomplish this goal? (Choose two.)

A. Create the users, upload all data, and then deploy the sharing rules.
B. Contact Salesforce to activate indexing before uploading the data.
C. First, load all account records, and then load all user records.
D. Defer sharing calculations until the data has finished uploading.
Suggested answer: A, D

Explanation:

Creating the users, uploading all data, and then deploying the sharing rules reduces the number of sharing recalculations that occur during the data load. Deferring sharing calculations until the data has finished uploading likewise improves performance by postponing sharing rule evaluation. These are recommended best practices for loading large data sets into Salesforce.

Cloud Kicks has the following requirements:

- Data needs to be sent from Salesforce to an external system to generate invoices from their Order Management System (OMS).

- A Salesforce administrator must be able to customize which fields will be sent to the external system without changing code.

What are two approaches for fulfilling these requirements? (Choose two.)

A. A Set<SObjectField> to determine which fields to send in an HTTP callout.
B. An Outbound Message to determine which fields to send to the OMS.
C. A Field Set that determines which fields to send in an HTTP callout.
D. Enable the field-level security permissions for the fields to send.
Suggested answer: B, C

Explanation:

An Outbound Message is a native Salesforce feature that sends data to an external system without code. It can be configured to include any fields from the source object. A Field Set is a collection of fields that can be referenced in Visualforce pages or Apex classes to dynamically determine which fields to send in an HTTP callout. Both approaches meet Cloud Kicks' requirements.
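The admin-editable-field-list idea behind a Field Set can be shown language-neutrally. A minimal Python sketch (the object and field names below are hypothetical; in Apex the list would come from `Schema.FieldSet` rather than a hard-coded constant):

```python
# Sketch: a Field Set driving an HTTP callout payload. The admin edits
# FIELD_SET (a stand-in for a real Field Set configured in Setup) without
# touching the callout code that builds the payload.

def build_invoice_payload(record, field_set):
    """Select only the admin-configured fields from a record."""
    return {f: record[f] for f in field_set if f in record}

order = {
    "Id": "801xx0000000001",
    "TotalAmount__c": 250.0,
    "Status__c": "Activated",
    "Internal_Notes__c": "do not send",  # deliberately excluded below
}

# Fields the admin chose to expose to the OMS (hypothetical names).
FIELD_SET = ["Id", "TotalAmount__c", "Status__c"]

payload = build_invoice_payload(order, FIELD_SET)
```

Only the configured fields end up in `payload`; the internal-notes field never leaves Salesforce.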

The architect is planning a large data migration for Universal Containers from their legacy CRM system to Salesforce. What three things should the architect consider to optimize performance of the data migration? Choose 3 answers

A. Review the time zones of the User loading the data.
B. Remove custom indexes on the data being loaded.
C. Determine if the legacy system is still in use.
D. Defer sharing calculations of the Salesforce Org.
E. Deactivate approval processes and workflow rules.
Suggested answer: B, D, E

Explanation:

Removing custom indexes on the data being loaded will prevent unnecessary index maintenance and improve the data load speed. Deferring sharing calculations of the Salesforce Org will avoid frequent sharing rule evaluations and reduce the load time. Deactivating approval processes and workflow rules will prevent triggering any automation logic that might slow down or fail the data load.

Universal Containers has a large volume of Contact data going into Salesforce.com. There are 100,000 existing contact records. 200,000 new contacts will be loaded. The Contact object has an external ID field that is unique and must be populated for all existing records. What should the architect recommend to reduce data load processing time?

A. Load Contact records together using the Streaming API via the Upsert operation.
B. Delete all existing records, and then load all records together via the Insert operation.
C. Load all records via the Upsert operation to determine new records vs. existing records.
D. Load new records via the Insert operation and existing records via the Update operation.
Suggested answer: D

Explanation:

Loading new records via the Insert operation and existing records via the Update operation allows using the external ID field as a unique identifier while avoiding any duplication or overwriting of records. This is faster and safer than deleting all existing records or relying on the Upsert operation, which might cause conflicts or errors.
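The pre-processing step this answer implies can be sketched briefly. Assuming the existing records' external IDs have already been extracted, the combined file is split into an Insert set and an Update set before loading (`Legacy_Id__c` is a hypothetical external ID field name):

```python
# Sketch: partition a migration extract into Insert vs. Update jobs
# using the unique external ID present on every existing record.

def split_for_load(records, existing_external_ids):
    """Route records whose external ID already exists to Update; rest to Insert."""
    to_insert, to_update = [], []
    for rec in records:
        if rec["Legacy_Id__c"] in existing_external_ids:
            to_update.append(rec)
        else:
            to_insert.append(rec)
    return to_insert, to_update

existing = {"L-001", "L-002"}  # external IDs already present in Salesforce
extract = [
    {"Legacy_Id__c": "L-001", "LastName": "Ng"},    # existing -> Update
    {"Legacy_Id__c": "L-003", "LastName": "Diaz"},  # new -> Insert
]

inserts, updates = split_for_load(extract, existing)
```

Each list then becomes its own Data Loader or Bulk API job, so Salesforce never has to resolve new-vs-existing per record the way Upsert does.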

An architect is planning on having different batches to load one million Opportunities into Salesforce using the Bulk API in parallel mode. What should be considered when loading the Opportunity records?

A. Create indexes on Opportunity object text fields.
B. Group batches by the AccountId field.
C. Sort batches by Name field values.
D. Order batches by Auto-number field.
Suggested answer: D

Explanation:

Ordering batches by Auto-number field will ensure that the records are processed in a sequential order and avoid any locking issues that might occur when loading related records in parallel mode. Creating indexes, grouping batches by AccountId, or sorting batches by Name field values are not necessary or beneficial for loading Opportunity records using the Bulk API.
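The batch preparation described above can be sketched as sorting on the auto-number value and then slicing into Bulk API batches (the field name `Opportunity_Number__c` is a hypothetical auto-number field; 10,000 records is the Bulk API batch-size limit):

```python
# Sketch: order records by an auto-number value, then slice them into
# fixed-size Bulk API batches for parallel submission.

BATCH_SIZE = 10_000  # Bulk API maximum records per batch

def make_batches(records, key="Opportunity_Number__c", size=BATCH_SIZE):
    """Sort by the auto-number value, then slice into consecutive batches."""
    ordered = sorted(records, key=lambda r: r[key])
    return [ordered[i:i + size] for i in range(0, len(ordered), size)]

# Tiny demo with size=2 so the slicing is visible.
demo = [{"Opportunity_Number__c": n} for n in (5, 1, 4, 2, 3)]
batches = make_batches(demo, size=2)
```

Because each batch covers a contiguous ID range, two parallel batches are far less likely to touch the same parent rows at the same time.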

DreamHouse Realty has 15 million records in the Order__c custom object. When running a bulk query, the query times out.

What should be considered to address this issue?

A. Tooling API
B. PK Chunking
C. Metadata API
D. Streaming API
Suggested answer: B

Explanation:

PK Chunking is a feature of the Bulk API that splits a large query into smaller batches based on the primary key (record ID) of the object. This improves performance and avoids query timeouts when extracting large data sets.
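The splitting idea is easy to illustrate. Real Salesforce IDs are base-62 strings and the Bulk API performs this split itself when the `Sforce-Enable-PKChunking` header is set; the integer-ID version below is a simplified sketch of the same idea:

```python
# Sketch: the idea behind PK chunking, using integer IDs for simplicity.
# Each yielded range becomes its own small, index-friendly query instead
# of one 15-million-row scan.

def pk_chunks(min_id, max_id, chunk_size):
    """Yield (lower, upper) ID ranges covering [min_id, max_id]."""
    lower = min_id
    while lower <= max_id:
        upper = min(lower + chunk_size - 1, max_id)
        yield lower, upper
        lower = upper + 1

ranges = list(pk_chunks(1, 15_000_000, 250_000))
# Each range maps to something like:
#   SELECT ... FROM Order__c WHERE Id >= :lower AND Id <= :upper
```

With a 250,000-record chunk size (the PK chunking maximum), the 15 million rows become 60 range-bounded queries that each complete well within the timeout.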

Company S was recently acquired by Company T. As part of the acquisition, all of the data for the Company S's Salesforce instance (source) must be migrated into the Company T's Salesforce instance (target). Company S has 6 million Case records.

An Architect has been tasked with optimizing the data load time.

What should the Architect consider to achieve this goal?

A. Pre-process the data, then use Data Loader with SOAP API to upsert with zip compression enabled.
B. Directly leverage Salesforce-to-Salesforce functionality to load Case data.
C. Load the data in multiple sets using Bulk API parallel processes.
D. Utilize the Salesforce Org Migration Tool from the Setup Data Management menu.
Suggested answer: A

Explanation:

Pre-processing the data means transforming and cleansing it before loading it into Salesforce, which reduces the errors and conflicts that might occur during the load. Using Data Loader with the SOAP API to upsert with zip compression enabled also improves the performance and efficiency of the load by reducing network bandwidth and avoiding duplicate records.

Universal Containers (UC) has users complaining about reports timing out or simply taking too long to run. What two actions should the data architect recommend to improve the reporting experience? Choose 2 answers

A. Index key fields used in report criteria.
B. Enable Divisions for large data objects.
C. Create one skinny table per report.
D. Share each report with fewer users.
Suggested answer: A, C

Explanation:

Indexing key fields used in report criteria can speed up the query execution and reduce the report run time. Indexes can be created by Salesforce automatically or manually by request. Creating one skinny table per report can also improve the reporting performance by storing frequently used fields in a separate table that does not include complex formulas or joins.

A company has 12 million records, and a nightly integration queries these records.

Which two areas should a Data Architect investigate during troubleshooting if queries are timing out? (Choose two.)

A. Make sure the query doesn't contain NULL in any filter criteria.
B. Create a formula field instead of having multiple filter criteria.
C. Create custom indexes on the fields used in the filter criteria.
D. Modify the integration users' profile to have View All Data.
Suggested answer: A, C

Explanation:

Making sure the query does not contain NULL in any filter criteria can avoid full table scans and leverage indexes more efficiently. Queries with NULL filters are not selective and can cause performance issues. Creating custom indexes on the fields used in the filter criteria can also enhance the query performance by reducing the number of records to scan.
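The two checks named in this explanation can be captured in a small pre-flight lint run against the integration's queries. A minimal sketch, assuming a known set of indexed fields (the field and index names below are hypothetical):

```python
# Sketch: flag the two anti-patterns above before a nightly query runs —
# NULL comparisons in the WHERE clause, and filters with no indexed field.

INDEXED_FIELDS = {"Id", "Name", "CreatedDate", "External_Id__c"}  # hypothetical

def filter_issues(where_fields, where_clause):
    """Return a list of selectivity problems found in a query's filter."""
    issues = []
    if "null" in where_clause.lower():
        issues.append("NULL comparison prevents index use")
    if not any(f in INDEXED_FIELDS for f in where_fields):
        issues.append("no indexed field in filter; request a custom index")
    return issues

bad = filter_issues(["Status__c"], "Status__c != null")
good = filter_issues(["CreatedDate"], "CreatedDate = LAST_N_DAYS:1")
```

Here `bad` trips both checks while `good` passes clean; in practice the same questions are answered authoritatively by Salesforce's Query Plan tool.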

Universal Containers (UC) is implementing a Salesforce project with large volumes of data and daily transactions. The solution includes both real-time web service integrations and Visualforce mash-ups with back-end systems. The Salesforce Full sandbox used by the project integrates with full-scale back-end testing systems. What two types of performance testing are appropriate for this project?

Choose 2 answers

A. Pre-go-live automated page-load testing against the Salesforce Full sandbox.
B. Post-go-live automated page-load testing against the Salesforce Production org.
C. Pre-go-live unit testing in the Salesforce Full sandbox.
D. Stress testing against the web services hosted by the integration middleware.
Suggested answer: A, D

Explanation:

Pre-go-live automated page-load testing against the Salesforce Full sandbox can help identify and resolve any performance bottlenecks or issues before deploying the solution to production. The Full sandbox is an ideal environment for performance testing as it replicates the production org in terms of data, metadata, and integrations. Stress testing against the web services hosted by the integration middleware can also help evaluate the scalability and reliability of the integration solution under high load conditions.

Total 260 questions