Salesforce Certified Data Architect Practice Test - Questions Answers, Page 9

Universal Containers (UC) is expecting to have nearly 5 million shipments records in its Salesforce org. Each shipment record has up to 10 child shipment item records. The Shipment custom object has an Organization-wide Default (OWD) sharing model set to Private and the Shipment Item custom object has a Master-Detail relationship to Shipment. There are 25 sharing rules set on the Shipment custom object, which allow shipment records to be shared to each of UC's 25 business areas around the globe. These sharing rules use public groups, one for each business area plus a number of groups for management and support roles. UC has a high turnover of Sales Reps and often needs to move Sales Reps between business areas in order to meet local demand. What feature would ensure that performance, when moving Sales Reps between regions, remains adequate while meeting existing requirements?

A. Implement data archiving for old Shipment records.

B. Contact Salesforce to create Skinny tables on Shipment.

C. Configure Shipment OWD to Public Read/Write.

D. Contact Salesforce to enable Defer Sharing Rules.
Suggested answer: D

Explanation:

Contacting Salesforce to enable Defer Sharing Rules is the feature that keeps performance adequate when moving Sales Reps between regions while meeting the existing requirements. Defer Sharing Rules lets an administrator suspend sharing rule recalculation while making changes to the public groups or roles that sharing rules reference, then run a single recalculation afterward. This improves performance and avoids record-locking issues when users are moved between groups or roles. Implementing data archiving for old Shipment records does not address the sharing recalculation triggered by group membership changes. Skinny tables improve query performance on large tables but have no effect on sharing recalculation. Configuring the Shipment OWD as Public Read/Write would not meet the existing requirement for a Private sharing model.
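For context, it is these group membership changes that trigger the expensive recalculation. Below is a minimal sketch, assuming the simple-salesforce Python library and hypothetical group names and user Id (none of these come from the question), of the GroupMember operations involved in moving a Sales Rep between business-area public groups; with Defer Sharing Rules enabled, an administrator can suspend recalculation before a batch of such moves and run a single recalculation afterward.

```python
# A sketch of moving a Sales Rep between business-area public groups.
# Assumptions: simple-salesforce library; the group DeveloperNames and
# user Id below are hypothetical placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="pw", security_token="token")

REP_USER_ID = "005XXXXXXXXXXXXXXX"   # hypothetical Sales Rep user Id
OLD_GROUP = "EMEA_Business_Area"     # hypothetical public group names
NEW_GROUP = "APAC_Business_Area"

def group_id(developer_name):
    """Look up a public group's Id by DeveloperName."""
    res = sf.query(
        "SELECT Id FROM Group "
        "WHERE DeveloperName = '%s' AND Type = 'Regular'" % developer_name
    )
    return res["records"][0]["Id"]

# Remove the rep from the old business-area group. GroupMember rows
# support only create and delete, not update.
old_memberships = sf.query(
    "SELECT Id FROM GroupMember "
    "WHERE GroupId = '%s' AND UserOrGroupId = '%s'"
    % (group_id(OLD_GROUP), REP_USER_ID)
)
for rec in old_memberships["records"]:
    sf.GroupMember.delete(rec["Id"])

# Add the rep to the new business-area group. Each such change would
# normally trigger sharing recalculation against ~5M Shipment records;
# deferring lets all moves complete before one recalculation runs.
sf.GroupMember.create({
    "GroupId": group_id(NEW_GROUP),
    "UserOrGroupId": REP_USER_ID,
})
```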

A customer is facing locking issues when importing large data volumes of order records that are children in a master-detail relationship with the Account object. What is the recommended way to avoid locking issues during import?

A. Import Account records first, followed by order records after sorting the orders by OrderID.

B. Import Account records first, followed by order records after sorting the orders by AccountID.

C. Change the relationship to Lookup and update the relationship to master-detail after import.

D. Import Order records and Account records separately and populate AccountID in orders using batch Apex.
Suggested answer: B

Explanation:

Importing Account records first, followed by order records after sorting the orders by AccountID, is the recommended way to avoid locking issues during import. Sorting the child records by their parent's Id groups the children of each Account together, so fewer batches update the same parent record concurrently, which minimizes lock contention errors. Sorting orders by OrderID does not group the child records by parent, so it does not help. Changing the relationship to Lookup and converting it to master-detail after import is risky, because converting a Lookup to a master-detail relationship requires every child record to have a parent, which may not be the case after import. Importing Order records and Account records separately and populating AccountID with batch Apex does not avoid locking either, because the parent records are still updated across batches.
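To illustrate the sorting step, here is a minimal sketch in Python, assuming a hypothetical orders.csv export with an AccountID column; the sorted file is what would then be loaded via the Bulk API.

```python
# Sort child Order rows by parent AccountID before a Bulk API load, so
# all children of one Account land in the same or adjacent batches and
# parallel batches rarely contend for the same parent record lock.
# Assumption: a hypothetical orders.csv with an AccountID column.
import csv

with open("orders.csv", newline="") as f:
    reader = csv.DictReader(f)
    rows = sorted(reader, key=lambda r: r["AccountID"])
    fieldnames = reader.fieldnames

with open("orders_sorted.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
```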

Universal Containers wants to develop a dashboard in Salesforce that will allow Sales Managers to do data exploration using their mobile device (i.e., drill down into sales-related data) and have the possibility of adding ad-hoc filters while on the move. What is a recommended solution for building data exploration dashboards in Salesforce?

A. Create a Dashboard in an external reporting tool, export data to the tool, and add a link to the dashboard in Salesforce.

B. Create a Dashboard in an external reporting tool, export data to the tool, and embed the dashboard in Salesforce using the Canvas toolkit.

C. Create a standard Salesforce Dashboard and connect it to reports with the appropriate filters.

D. Create a Dashboard using Analytics Cloud that will allow the user to create ad-hoc lenses and drill down.
Suggested answer: D

Explanation:

Creating a Dashboard using Analytics Cloud that allows the user to create ad-hoc lenses and drill down is the recommended solution for building data exploration dashboards in Salesforce. Analytics Cloud is a powerful data analysis tool that lets users explore data through interactive dashboards, charts, graphs, and tables on any device. Users can also create lenses, which are ad-hoc data queries that can be saved and reused, and drill into data details using filters and facets. Creating a dashboard in an external reporting tool and linking to it from Salesforce would not provide a seamless user experience and would require additional data integration and security considerations. Embedding an external dashboard in Salesforce using the Canvas toolkit has the same integration and security drawbacks and is not a native Salesforce solution. A standard Salesforce dashboard connected to filtered reports would not let the user create ad-hoc lenses and drill into data details from a mobile device.

DreamHouse Realty has a legacy system that captures Branch Offices and Transactions. DreamHouse Realty has 15 Branch Offices. Transactions can relate to any Branch Office. DreamHouse Realty creates hundreds of thousands of Transactions per year.

A Data Architect needs to denormalize this data model into a single Transaction object with a Branch Office picklist.

What are two important considerations for the Data Architect in this scenario? (Choose two.)

A. Standard list view in-line editing.

B. Limitations on Org data storage.

C. Bulk API limitations on picklist fields.

D. Limitations on master-detail relationships.
Suggested answer: B, C

Explanation:

The Data Architect should consider the limitations on Org data storage and the Bulk API limitations on picklist fields when denormalizing the data model into a single Transaction object with a Branch Office picklist. The Org data storage limit is the total amount of data that can be stored in a Salesforce Org, and it depends on the edition and license type of the Org. The Bulk API limit on picklist fields is the maximum number of values that can be imported or exported using the Bulk API, which is 1,000 values per picklist field. These limitations could affect the performance and scalability of the data model, and the Data Architect should plan accordingly.
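To see why data storage is a genuine concern here, a quick back-of-the-envelope estimate (a sketch assuming the commonly documented 2 KB of data storage per record for most objects, and reading "hundreds of thousands" as roughly 300,000 Transactions per year):

```python
# Rough Org data storage estimate for the denormalized Transaction object.
# Assumptions: ~2 KB of data storage per record (the documented figure
# for most objects) and ~300,000 Transactions created per year.
RECORDS_PER_YEAR = 300_000
KB_PER_RECORD = 2

yearly_mb = RECORDS_PER_YEAR * KB_PER_RECORD / 1024
print(f"~{yearly_mb:.0f} MB of data storage per year")  # ~586 MB/year
```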

Cloud Kicks is launching a Partner Community, which will allow users to register shipment requests that are then processed by Cloud Kicks employees. Shipment requests contain header information, and then a list of no more than 5 items being shipped.

First, Cloud Kicks will introduce its community to 6,000 customers in North America, and then to 24,000 customers worldwide within the next two years. Cloud Kicks expects 12 shipment requests per week per customer, on average, and wants customers to be able to view up to three years of shipment requests and use Salesforce reports.

What is the recommended solution for the Cloud Kicks Data Architect to address the requirements?

A. Create an external custom object to track shipment requests and a child external object to track shipment items. External objects are stored off-platform in Heroku's Postgres database.

B. Create an external custom object to track shipment requests with five lookup custom fields for each item being shipped. External objects are stored off-platform in Heroku's Postgres database.

C. Create a custom object to track shipment requests and a child custom object to track shipment items. Implement an archiving process that moves data off-platform after three years.

D. Create a custom object to track shipment requests with five lookup custom fields for each item being shipped. Implement an archiving process that moves data off-platform after three years.
Suggested answer: C

Explanation:

The recommended solution is to create a custom object to track shipment requests and a child custom object to track shipment items, and to implement an archiving process that moves data off-platform after three years. This lets Cloud Kicks store and manage its shipment data in Salesforce and analyze it with Salesforce reports. Because Cloud Kicks expects a large volume of data over time, the archiving process keeps the Org from hitting its data storage limit and maintains optimal performance. External objects are not a good option for this scenario: they are stored off-platform in an external system, such as Heroku's Postgres database, and have limited functionality and performance compared to custom objects.
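The figures in the question make the case for archiving concrete; a quick sketch using the stated numbers:

```python
# Record volume implied by the question: 24,000 worldwide customers,
# 12 shipment requests per week each, three years of visible history,
# and one header plus up to five item records per request.
CUSTOMERS = 24_000
REQUESTS_PER_WEEK = 12
WEEKS_PER_YEAR = 52
RETENTION_YEARS = 3
RECORDS_PER_REQUEST = 1 + 5   # header + up to five shipment items

requests = CUSTOMERS * REQUESTS_PER_WEEK * WEEKS_PER_YEAR * RETENTION_YEARS
print(f"{requests:,} shipment requests retained")            # 44,928,000
print(f"up to {requests * RECORDS_PER_REQUEST:,} records")   # 269,568,000
```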

Universal Containers has successfully migrated 50 million records into five different objects multiple times in a full copy sandbox. The Integration Engineer wants to re-run the test a month before it goes live in Production. What is the recommended approach to re-run the test?

A. Truncate all 5 objects quickly and re-run the data migration test.

B. Refresh the full copy sandbox and re-run the data migration test.

C. Hard delete all 5 objects' data and re-run the data migration test.

D. Truncate all 5 objects and hard delete before running the migration test.
Suggested answer: B

Explanation:

The recommended approach to re-run the test is to refresh the full copy sandbox and re-run the data migration test. A full copy sandbox is a replica of the production Org, including all data, metadata, and attachments. Refreshing a full copy sandbox means creating a new copy of the production Org and replacing the existing sandbox. This would ensure that the test is run on a clean and up-to-date environment, without any leftover data or configuration from previous tests. Truncating or hard deleting objects would not be sufficient, because they would not remove all the data or metadata from the sandbox, and they could also affect other dependent objects or processes.

Universal Containers is integrating a new Opportunity engagement system with Salesforce. According to their Master Data Management strategy, Salesforce is the system of record for Account, Contact, and Opportunity data. However, there does seem to be valuable Opportunity data in the new system that potentially conflicts with what is stored in Salesforce. What is the recommended course of action to appropriately integrate this new system?

A. The MDM strategy defines Salesforce as the system of record, so Salesforce Opportunity values prevail in all conflicts.

B. A policy should be adopted so that the system whose record was most recently updated should prevail in conflicts.

C. The Opportunity engagement system should become the system of record for Opportunity records.

D. Stakeholders should be brought together to discuss the appropriate data strategy moving forward.
Suggested answer: D

Explanation:

The recommended course of action is to bring the stakeholders together to discuss the appropriate data strategy moving forward. There may be valuable data in both systems that needs to be reconciled and harmonized, and the Master Data Management (MDM) strategy may need to be revised or updated to accommodate the new system. The other options are not recommended, as they may result in data loss, inconsistency, or duplication.

For a production cutover, a large number of Account records will be loaded into Salesforce from a legacy system. The legacy system does not have enough information to determine the ownership of these Accounts upon initial load. Which two options are recommended for assigning Account ownership while mitigating potential performance problems? (Choose two.)

A. Let a "system user" own all the Account records without assigning any role to this user in the Role Hierarchy.

B. Let a "system user" own the Account records and assign this user to the lowest-level role in the Role Hierarchy.

C. Let the VP of the Sales department, who will report directly to the senior VP, own all the Account records.

D. Let a "system user" own all the Account records and make this user part of the highest-level role in the Role Hierarchy.
Suggested answer: A, B

Explanation:

The two recommended options are to let a "system user" own all the Account records without assigning any role to this user in the Role Hierarchy, or to let a "system user" own the Account records and assign this user to the lowest-level role in the Role Hierarchy. These options reduce the number of sharing calculations and rules that must be applied to the Account records during the load, improving the performance and scalability of the system. The other options are not recommended: placing the owner high in the Role Hierarchy increases sharing complexity and overhead, and can expose sensitive data to unauthorized users.
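As a load-time illustration, here is a minimal sketch assuming the simple-salesforce Python library, a hypothetical system user Id, and a hypothetical legacy_accounts.csv export with a Name column; every Account is inserted with the same OwnerId, so the load triggers minimal role-hierarchy sharing calculation, and ownership can be redistributed later.

```python
# Bulk-load Accounts with a single "system user" as owner.
# Assumptions: simple-salesforce library; the user Id and CSV file
# below are hypothetical placeholders.
import csv
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="pw", security_token="token")

SYSTEM_USER_ID = "005XXXXXXXXXXXXXXX"  # hypothetical system user (no role,
                                       # or the lowest role in the hierarchy)

with open("legacy_accounts.csv", newline="") as f:
    records = [
        {"Name": row["Name"], "OwnerId": SYSTEM_USER_ID}
        for row in csv.DictReader(f)
    ]

# Bulk API insert; ownership is reassigned later, once the business can
# determine the correct owners.
results = sf.bulk.Account.insert(records)
```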

Universal Containers (UC) is implementing its new Internet of Things technology, which consists of smart containers that provide information on container temperature and humidity, updated every 10 minutes back to UC. There are roughly 10,000 containers equipped with this technology, with the number expected to increase to 50,000 over the next five years. It is essential that Salesforce users have access to current and historical temperature and humidity data for each container. What is the recommended solution?

A. Create new custom fields for temperature and humidity in the existing Container custom object, as well as an external ID field that is unique for each container. These custom fields are updated when a new measure is received.

B. Create a new Container Reading custom object, which is created when a new measure is received for a specific container. The Container Reading custom object has a master-detail relationship to the Container object.

C. Create a new Lightning Component that displays the last humidity and temperature data for a specific container and can also display historical trends, obtaining the relevant data from UC's existing data warehouse.

D. Create a new Container Reading custom object with a master-detail relationship to Container, which is created when a new measure is received for a specific container. Implement an archiving process that runs every hour.
Suggested answer: D

Explanation:

The recommended solution for Universal Containers (UC) to implement its new Internet of Things technology is to create a new Container Reading custom object with a master-detail relationship to Container which is created when a new measure is received for a specific container. Implement an archiving process that runs every hour. This solution would allow UC to store and access current and historical temperature and humidity data for each container on Salesforce, and use reports and dashboards to analyze it. However, since UC expects a large volume of data over time, they should implement an archiving process that moves data off-platform after a certain period of time to avoid hitting the Org data storage limit and maintain optimal performance. The other options are not recommended, as they would either not store the historical data on Salesforce, or create too many custom fields on the Container object that could impact performance and usability.
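The measurement volume implied by the question shows why an hourly archiving process matters; a quick sketch with the stated figures:

```python
# Reading volume for the Container Reading object: one measure every
# 10 minutes per container, at the five-year projection of 50,000
# containers.
CONTAINERS = 50_000
READINGS_PER_HOUR = 6   # one measure every 10 minutes

per_day = CONTAINERS * READINGS_PER_HOUR * 24
print(f"{per_day:,} readings per day")         # 7,200,000
print(f"{per_day * 365:,} readings per year")  # 2,628,000,000
```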

Universal Containers is planning its archiving and purging strategy for its custom objects Topic__c and Comment__c. Several options are being considered, including analytics snapshots, offsite storage, and scheduled purges. Which three questions should be considered when designing an appropriate archiving strategy? (Choose three.)

A. How many fields are defined on the custom objects that need to be archived?

B. Which profiles and users currently have access to these custom object records?

C. If reporting is necessary, can the information be aggregated into fewer, summary records?

D. Will the data being archived need to be reported on or accessed in any way in the future?

E. Are there any regulatory restrictions that will influence the archiving and purging plans?
Suggested answer: C, D, E

Explanation:

The three questions that should be considered when designing an appropriate archiving strategy are: If reporting is necessary, can the information be aggregated into fewer, summary records? Will the data being archived need to be reported on or accessed in any way in the future? Are there any regulatory restrictions that will influence the archiving and purging plans? These questions help determine the scope, frequency, and method of archiving and purging data from Salesforce. For example, if reporting is necessary, summary records or analytics snapshots can store aggregated data and reduce the number of records that need to be archived. If the archived data needs to be accessed in the future, offsite storage or external objects can retain the data and make it available on demand. If there are regulatory restrictions, such as GDPR or HIPAA, the archiving and purging plans must comply with them and ensure data security and privacy.
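To illustrate the summary-record question, here is a minimal sketch assuming a hypothetical comments.csv export of Comment__c with Topic__c and CreatedDate columns; detail rows are rolled up into monthly summary rows (with a hypothetical Comment_Count__c measure) that reporting can keep using after the details are purged.

```python
# Aggregate Comment__c detail rows into monthly summary rows per Topic__c
# before purging. Assumptions: pandas, and a hypothetical comments.csv
# export with Topic__c and CreatedDate columns.
import pandas as pd

comments = pd.read_csv("comments.csv", parse_dates=["CreatedDate"])
summary = (
    comments
    .groupby(["Topic__c", comments["CreatedDate"].dt.to_period("M")])
    .size()
    .reset_index(name="Comment_Count__c")
)
summary.to_csv("comment_monthly_summary.csv", index=False)
```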
