Salesforce Certified Data Architect Practice Test - Questions Answers, Page 12

Two million Opportunities need to be loaded in different batches into Salesforce using the Bulk API in parallel mode.

What should an Architect consider when loading the Opportunity records?

A. Use the Name field values to sort batches.
B. Order batches by Auto-number field.
C. Create indexes on Opportunity object text fields.
D. Group batches by the AccountId field.
Suggested answer: D

Explanation:

Grouping batches by the AccountId field can improve performance and avoid record-locking issues when loading Opportunity records with the Bulk API in parallel mode. Inserting an Opportunity locks its parent Account record, so if Opportunities for the same Account are spread across batches that run in parallel, those batches contend for the same lock and can fail or be retried. Keeping each Account's Opportunities together in one batch avoids that contention.
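To make the idea concrete, here is a minimal Python sketch of pre-grouping records so that each Account's Opportunities land in the same batch before submission. The function name and record shape are illustrative, not part of any Salesforce API:

```python
from itertools import groupby
from operator import itemgetter

def batch_by_account(records, batch_size=10000):
    """Pack Opportunity records into batches so that all records sharing
    an AccountId stay in one batch, avoiding parent Account lock
    contention across parallel Bulk API batches."""
    # groupby requires sorted input, so sort by the grouping key first.
    records = sorted(records, key=itemgetter("AccountId"))
    batches, current = [], []
    for _, group in groupby(records, key=itemgetter("AccountId")):
        group = list(group)
        # Start a new batch if this account's records would overflow it.
        if current and len(current) + len(group) > batch_size:
            batches.append(current)
            current = []
        current.extend(group)
    if current:
        batches.append(current)
    return batches
```

Note that an Account with more records than batch_size would still produce one oversized batch; such accounts would need to be split out and loaded serially.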

Ursa Major Solar has defined a new Data Quality Plan for their Salesforce data.

Which two approaches should an Architect recommend to enforce the plan throughout the organization? (Choose two.)

A. Ensure all data is stored in an external system and set up an integration to Salesforce for view-only access.
B. Schedule reports that will automatically catch duplicates and merge or delete the records every week.
C. Enforce critical business processes by using Workflow, Validation Rules, and Apex code.
D. Schedule a weekly dashboard displaying records that are missing information to be sent to managers for review.
Suggested answer: C, D

Explanation:

Enforcing critical business processes with Workflow, Validation Rules, and Apex code helps ensure data quality and consistency by applying rules and logic as data is entered and updated. Scheduling a weekly dashboard of records that are missing information, delivered to managers for review, helps identify and fix data quality issues by providing visibility and accountability.

DreamHouse Realty has a Salesforce deployment that manages Sales, Support, and Marketing efforts in a multi-system ERP environment. The company recently reached the limits of native reports and dashboards and needs options for providing more analytical insights.

What are two approaches an Architect should recommend? (Choose two.)

A. Weekly Snapshots
B. Einstein Analytics
C. Setup Audit Trails
D. AppExchange Apps
Suggested answer: B, D

Explanation:

Einstein Analytics can provide more analytical insight than native reports and dashboards by letting users explore data from multiple sources, build interactive visualizations, and apply AI-powered features. AppExchange apps can also extend reporting and analytics capabilities with pre-built solutions or integrations with external tools.

Cloud Kicks currently has a Public Read/Write sharing model for the company's Contacts. The management team requests that only the owner of a Contact record be allowed to delete that Contact.

What should an Architect do to meet these requirements?

A. Set the profile of the users to remove delete permission from the Contact object.
B. Check if the current user is NOT the owner by creating a 'before delete' trigger.
C. Set the Sharing settings as Public Read Only for the Contact object.
D. Check if the current user is NOT the owner by creating a validation rule on the Contact object.
Suggested answer: B

Explanation:

Creating a 'before delete' trigger that checks whether the current user is the owner can meet the requirement that only the owner of a Contact record may delete it. A trigger is a piece of Apex code that executes before or after a record is inserted, updated, deleted, or undeleted; a 'before delete' trigger can block the deletion by calling the addError() method on any record whose OwnerId does not match the current user. Removing the delete permission at the profile level (option A) would block owners as well, so it does not meet the requirement.

An Architect needs information about who is creating, changing, or deleting certain fields within the past four months.

How can the Architect access this information?

A. Create a field history report for the fields in question.
B. After exporting the setup audit trail, find the fields in question.
C. After exporting the metadata, search it for the fields in question.
D. Remove 'customize application' permissions from everyone else.
Suggested answer: B

Explanation:

Exporting the setup audit trail can show who created, changed, or deleted the fields in question within the past four months. The setup audit trail tracks recent setup changes that administrators and other users have made to the organization. The Setup page displays only the 20 most recent entries, but administrators can download a report (in CSV format) covering up to six months of setup history, which comfortably covers the four-month window.
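As an illustration, a downloaded audit trail CSV could be filtered with a short script. The 'Date' and 'Action' column names and the timestamp format below are assumptions; verify them against the header row of your actual export:

```python
import csv
import io
from datetime import datetime, timedelta

def recent_field_changes(csv_text, field_name, months=4):
    """Scan an exported setup audit trail CSV for entries that mention a
    given field within the past `months` months. Column names ('Date',
    'Action') and the date format are assumptions about the export."""
    cutoff = datetime.now() - timedelta(days=30 * months)
    matches = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        # Assumed timestamp format; adjust to match your export.
        changed_at = datetime.strptime(row["Date"], "%m/%d/%Y %H:%M:%S")
        if changed_at >= cutoff and field_name in row["Action"]:
            matches.append(row)
    return matches
```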

Universal Containers has more than 10 million records in the Order__c object, and a bulk query against it has timed out. What should be considered to resolve the query timeout?

A. Tooling API
B. PK Chunking
C. Metadata API
D. Streaming API
Suggested answer: B

Explanation:

PK Chunking can resolve query timeout when running a bulk query on an object with more than 10 million records. PK Chunking is a feature of the Bulk API that splits a query into multiple batches based on the record IDs (primary keys) of the queried object. This can improve the query performance and avoid timeouts by reducing the number of records processed in each batch.
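In practice, PK Chunking is enabled by adding the Bulk API request header Sforce-Enable-PKChunking (for example, chunkSize=100000), and Salesforce generates the ID ranges itself. The Python sketch below, which fakes numeric IDs for clarity, only illustrates the underlying idea of splitting one large query into primary-key ranges:

```python
def chunk_queries(first_id, last_id, chunk_size, base_query):
    """Illustrate what PK Chunking does server-side: split one query
    into ranges over the primary key. Numeric IDs stand in for real
    15/18-character Salesforce record IDs."""
    queries = []
    start = first_id
    while start <= last_id:
        end = min(start + chunk_size - 1, last_id)
        queries.append(f"{base_query} WHERE Id >= {start} AND Id <= {end}")
        start = end + 1
    return queries
```

Each resulting query scans only a bounded slice of the table, which is why no single batch runs long enough to time out.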

Universal Containers (UC) has a data model as shown in the image. The Project object has a private sharing model and Roll-Up Summary fields that calculate the number of resources assigned to the project, the total hours for the project, and the number of work items associated with the project. What should the architect consider, knowing that a large volume of time entry records will be loaded regularly from an external system into Salesforce?

A. Load all data using external IDs to link to parent records.
B. Use workflow to calculate summary values instead of Roll-Up.
C. Use triggers to calculate summary values instead of Roll-Up.
D. Load all data after deferring sharing calculations.
Suggested answer: D

Explanation:

Loading all data after deferring sharing calculations can improve performance and avoid locking issues when regularly loading a large volume of time entry records. Deferring sharing calculations temporarily suspends sharing rule evaluation while the data loads, then recalculates sharing in a single operation afterward rather than after every batch.

Which two aspects of data does an Enterprise data governance program aim to improve?

A. Data integrity
B. Data distribution
C. Data usability
D. Data modeling
Suggested answer: A, C

Explanation:

Data integrity and data usability are the two aspects of data that an enterprise data governance program aims to improve. Data integrity refers to the accuracy, consistency, and validity of data across the enterprise. Data usability refers to how easily end users can access, analyze, and interpret the data.

Universal Containers (UC) has over 10 million accounts with an average of 20 opportunities with each account. A Sales Executive at UC needs to generate a daily report for all opportunities in a specific opportunity stage.

Which two key considerations should be made to make sure the performance of the report is not degraded due to large data volume?

A. Number of queries running at a time.
B. Number of joins used in report query.
C. Number of records returned by report query.
D. Number of characters in report query.
Suggested answer: B, C

Explanation:

The number of joins used in the report query and the number of records returned by it are the two key considerations for keeping report performance from degrading under large data volume. Joins increase the complexity and execution time of the query, especially when multiple large objects are involved, and the number of records returned determines how much data the report engine must process and display.

A health care provider wishes to use Salesforce to track patient care. The following actors are involved:

1. Payment Providers: Organizations who pay for the care of patients.

2. Doctors: They provide care plans for patients and need to support multiple patients; they are given access to patient information.

3. Patients: They are individuals who need care.

A data architect needs to map each actor to a Salesforce object. What is the optimal selection by the data architect?

A. Patients as Contacts, Payment Providers as Accounts, & Doctors as Accounts
B. Patients as Person Accounts, Payment Providers as Accounts, & Doctors as Contacts
C. Patients as Person Accounts, Payment Providers as Accounts, & Doctors as Person Accounts
D. Patients as Accounts, Payment Providers as Accounts, & Doctors as Person Accounts
Suggested answer: C

Explanation:

Mapping Patients as Person Accounts, Payment Providers as Accounts, and Doctors as Person Accounts is the optimal choice. Person Accounts are a special kind of Account designed for individual customers rather than businesses, which fits both patients and doctors as individuals. Payment Providers are organizations that pay for patient care, so they map naturally to business Accounts.

Total 260 questions