Salesforce Certified Data Architect Practice Test - Questions Answers, Page 9
List of questions
Question 81

Universal Containers (UC) is expecting to have nearly 5 million shipments records in its Salesforce org. Each shipment record has up to 10 child shipment item records. The Shipment custom object has an Organization-wide Default (OWD) sharing model set to Private and the Shipment Item custom object has a Master-Detail relationship to Shipment. There are 25 sharing rules set on the Shipment custom object, which allow shipment records to be shared to each of UC's 25 business areas around the globe. These sharing rules use public groups, one for each business area plus a number of groups for management and support roles. UC has a high turnover of Sales Reps and often needs to move Sales Reps between business areas in order to meet local demand. What feature would ensure that performance, when moving Sales Reps between regions, remains adequate while meeting existing requirements?
Explanation:
Contacting Salesforce to enable Defer Sharing Rules is the feature that would ensure performance remains adequate when moving Sales Reps between regions while meeting the existing requirements. Defer Sharing Rules lets you defer sharing rule recalculation when you make changes to public groups or roles used in sharing rules, which improves performance and avoids locking issues when users are moved between groups or roles. Implementing data archiving for old Shipment records will not help with the performance issue related to sharing rules. Contacting Salesforce to create skinny tables on Shipment will not help with the performance issue related to sharing rules either. Configuring the Shipment OWD to Public Read/Write will not meet the existing requirement of a Private sharing model.
Question 82

A customer is facing locking issues when importing large data volumes of order records that are children in a master-detail relationship with the Account object. What is the recommended way to avoid locking issues during import?
Explanation:
Importing Account records first, followed by Order records sorted by AccountId, is the recommended way to avoid locking issues during import. This reduces lock contention errors by minimizing the number of parent records that are processed concurrently by multiple batches; sorting orders by AccountId groups the child records by their parent record, so the same parent Account is not updated from different batches. Importing Account records first followed by Order records sorted by OrderId will not help, because it does not group the child records by their parent records. Changing the relationship to Lookup and converting it back to Master-Detail after import will not work, because converting a Lookup to a Master-Detail relationship requires every child record to have a parent, which may not be the case after import. Importing Order records and Account records separately and populating AccountId on orders using Batch Apex will not avoid locking issues, because updating the relationship in batches still contends for locks on the parent Account records.
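As an illustration of the sorting step (not part of the exam answer), the following is a minimal Python sketch that orders a child CSV by AccountId before batching; the file names, column names, and batch size are assumptions for illustration only.

```python
import csv

# Read the child (order) rows exported from the legacy system.
with open("orders.csv", newline="") as f:
    reader = csv.DictReader(f)
    fieldnames = reader.fieldnames
    rows = list(reader)

# Sort by the parent key so orders belonging to the same Account sit next to
# each other; consecutive batches then touch as few distinct parents as possible.
rows.sort(key=lambda r: r["AccountId"])

with open("orders_sorted.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)

# A Bulk API load would then split the sorted file into batches, e.g. 10,000
# rows each, so a given Account's orders are confined to as few batches as possible.
BATCH_SIZE = 10_000
batches = [rows[i:i + BATCH_SIZE] for i in range(0, len(rows), BATCH_SIZE)]
```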
Question 83

Universal Containers wants to develop a dashboard in Salesforce that will allow Sales Managers to do data exploration using their mobile device (i.e., drill down into sales-related data) and have the possibility of adding ad-hoc filters while on the move. What is a recommended solution for building data exploration dashboards in Salesforce?
Explanation:
Creating a Dashboard using Analytics Cloud that allows the user to create ad-hoc lenses and drill down is a recommended solution for building data exploration dashboards in Salesforce. Analytics Cloud is a powerful data analysis tool that enables users to explore data using interactive dashboards, charts, graphs, and tables on any device. Users can also create lenses, which are ad-hoc data queries that can be saved and reused, and drill down into data details using filters and facets. Creating a Dashboard in an external reporting tool, exporting data to the tool, and adding a link to the dashboard in Salesforce will not provide a seamless user experience and may require additional data integration and security considerations. Creating a Dashboard in an external reporting tool, exporting data to the tool, and embedding the dashboard in Salesforce using the Canvas toolkit will not provide a native Salesforce solution and may also require additional data integration and security considerations. Creating a standard Salesforce Dashboard and connecting it to reports with the appropriate filters will not allow the user to create ad-hoc lenses and drill down into data details on their mobile device.
Question 84

DreamHouse Realty has a legacy system that captures Branch Offices and Transactions. DreamHouse Realty has 15 Branch Offices. Transactions can relate to any Branch Office. DreamHouse Realty has created hundreds of thousands of Transactions per year.
A Data Architect needs to denormalize this data model into a single Transaction object with a Branch Office picklist.
What are two important considerations for the Data Architect in this scenario? (Choose two.)
Explanation:
The Data Architect should consider the limitations on Org data storage and the Bulk API limitations on picklist fields when denormalizing the data model into a single Transaction object with a Branch Office picklist. The Org data storage limit is the total amount of data that can be stored in a Salesforce Org, and it depends on the edition and license type of the Org. The Bulk API limit on picklist fields is the maximum number of values that can be imported or exported using the Bulk API, which is 1,000 values per picklist field. These limitations could affect the performance and scalability of the data model, and the Data Architect should plan accordingly.
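To see why Org data storage is a consideration here, a back-of-the-envelope estimate can be made; the figures below (roughly 300,000 Transactions per year, i.e., "hundreds of thousands", and ~2 KB of data storage counted per record) are assumptions for illustration.

```python
# Rough data-storage estimate for the denormalized Transaction object.
records_per_year = 300_000   # assumed "hundreds of thousands" of Transactions per year
kb_per_record = 2            # typical data-storage allocation counted per record
years = 5

storage_mb = records_per_year * years * kb_per_record / 1024
print(f"~{storage_mb:,.0f} MB of data storage after {years} years")  # ~2,930 MB
```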
Question 85

Cloud Kicks is launching a Partner Community, which will allow users to register shipment requests that are then processed by Cloud Kicks employees. Shipment requests contain header information, and then a list of no more than 5 items being shipped.
First, Cloud Kicks will introduce its community to 6,000 customers in North America, and then to 24,000 customers worldwide within the next two years. Cloud Kicks expects 12 shipment requests per week per customer, on average, and wants customers to be able to view up to three years of shipment requests and use Salesforce reports.
What is the recommended solution for the Cloud Kicks Data Architect to address the requirements?
Explanation:
The recommended solution for the Cloud Kicks Data Architect is to create a custom object to track shipment requests and a child custom object to track shipment items, and to implement an archiving process that moves data off-platform after three years. This solution allows Cloud Kicks to store and manage its shipment data on Salesforce and use Salesforce reports to analyze it. However, since Cloud Kicks expects a large volume of data over time, the archiving process that moves data off-platform after three years is needed to avoid hitting the Org data storage limit and to maintain optimal performance. External objects are not a good option for this scenario, because they are stored off-platform in an external system, such as Heroku's Postgres database, and they have limited functionality and performance compared to custom objects.
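A rough sizing exercise, using the figures from the scenario and assuming ~2 KB of data storage per record and the worst case of five items per shipment request, shows why off-platform archiving becomes necessary.

```python
# Sizing sketch for the three-year retention window (assumed worst case).
customers = 24_000
requests_per_week = 12
weeks_per_year = 52
years_retained = 3
items_per_request = 5   # "no more than 5 items" per shipment request

headers = customers * requests_per_week * weeks_per_year * years_retained
items = headers * items_per_request
total_gb = (headers + items) * 2 / 1024 / 1024   # ~2 KB per record

print(f"{headers:,} request records, {items:,} item records, ~{total_gb:,.0f} GB")
# 44,928,000 request records, 224,640,000 item records, ~514 GB
```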
Question 86

Universal Containers has successfully migrated 50 million records into five different objects multiple times in a full copy sandbox. The Integration Engineer wants to re-run the test again a month before it goes live into Production. What is the recommended approach to re-run the test?
Explanation:
The recommended approach to re-run the test is to refresh the full copy sandbox and re-run the data migration test. A full copy sandbox is a replica of the production Org, including all data, metadata, and attachments. Refreshing a full copy sandbox means creating a new copy of the production Org and replacing the existing sandbox. This would ensure that the test is run on a clean and up-to-date environment, without any leftover data or configuration from previous tests. Truncating or hard deleting objects would not be sufficient, because they would not remove all the data or metadata from the sandbox, and they could also affect other dependent objects or processes.
Question 87

Universal Containers is integrating a new Opportunity engagement system with Salesforce. According to their Master Data Management strategy, Salesforce is the system of record for Account, Contact, and Opportunity data. However, there does seem to be valuable Opportunity data in the new system that potentially conflicts with what is stored in Salesforce. What is the recommended course of action to appropriately integrate this new system?
Explanation:
The recommended course of action to appropriately integrate the new Opportunity engagement system with Salesforce is to bring the stakeholders together to discuss the appropriate data strategy moving forward. This is because there may be valuable data in both systems that needs to be reconciled and harmonized, and the Master Data Management (MDM) strategy may need to be revised or updated to accommodate the new system. The other options are not recommended, as they may result in data loss, inconsistency, or duplication.
Question 88

For a production cutover, a large number of Account records will be loaded into Salesforce from a legacy system. The legacy system does not have enough information to determine the ownership of these Accounts upon initial load. Which two options are recommended for assigning Account ownership while mitigating potential performance problems?
Explanation:
The two recommended options to assign Account ownership while mitigating potential performance problems are to let a "system user" own all the Account records without assigning any role to this user in the Role Hierarchy, or to let a "system user" own the Account records and assign this user to the lowest-level role in the Role Hierarchy. These options reduce the number of sharing calculations and rules that need to be applied to the Account records, improving the performance and scalability of the system. The other options are not recommended, as they would increase the sharing complexity and overhead, and potentially expose sensitive data to unauthorized users.
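As a simple illustration of the first option, the load file can be stamped with the system user's Id before import; the Id, file names, and column names below are placeholders, not values from the scenario.

```python
import csv

# Placeholder Id of the dedicated "system user"; in a real load this would be
# the 18-character Id of an actual User record.
SYSTEM_USER_ID = "005XXXXXXXXXXXXXXX"

with open("accounts.csv", newline="") as src, \
     open("accounts_owned.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    fieldnames = list(reader.fieldnames) + ["OwnerId"]
    writer = csv.DictWriter(dst, fieldnames=fieldnames)
    writer.writeheader()
    for row in reader:
        # Every Account is owned by the single system user, so the initial load
        # triggers minimal role-based sharing recalculation.
        row["OwnerId"] = SYSTEM_USER_ID
        writer.writerow(row)
```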
Question 89

Universal Containers (UC) is implementing its new Internet of Things technology, which consists of smart containers that send temperature and humidity readings back to UC every 10 minutes. There are roughly 10,000 containers equipped with this technology, with the number expected to increase to 50,000 over the next five years. It is essential that Salesforce users have access to current and historical temperature and humidity data for each container. What is the recommended solution?
Explanation:
The recommended solution for Universal Containers (UC) is to create a new Container Reading custom object with a master-detail relationship to Container, where a record is created each time a new reading is received for a specific container, and to implement an archiving process that runs every hour. This solution allows UC to store and access current and historical temperature and humidity data for each container on Salesforce, and to use reports and dashboards to analyze it. However, since UC expects a large volume of data over time, it should implement an archiving process that moves data off-platform after a certain period of time to avoid hitting the Org data storage limit and maintain optimal performance. The other options are not recommended, as they would either not store the historical data on Salesforce or create too many custom fields on the Container object, which could impact performance and usability.
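A quick volume check, assuming one reading per container every 10 minutes as described in the scenario, shows why such a frequent archiving cadence is essential.

```python
# Reading-volume check for the current and projected container fleet.
readings_per_day = 24 * 60 // 10   # 144 readings per container per day

for containers in (10_000, 50_000):
    per_day = containers * readings_per_day
    per_year = per_day * 365
    print(f"{containers:,} containers -> {per_day:,} records/day, {per_year:,} records/year")
# 10,000 containers -> 1,440,000 records/day, 525,600,000 records/year
# 50,000 containers -> 7,200,000 records/day, 2,628,000,000 records/year
```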
Question 90

Universal Containers is planning out their archiving and purging plans going forward for their custom objects Topic__c and Comment__c. Several options are being considered, including analytics snapshots, offsite storage, scheduled purges, etc. Which three questions should be considered when designing an appropriate archiving strategy?
Explanation:
The three questions that should be considered when designing an appropriate archiving strategy are: If reporting is necessary, can the information be aggregated into fewer, summary records? Will the data being archived need to be reported on or accessed in any way in the future? Are there any regulatory restrictions that will influence the archiving and purging plans? These questions are important because they help determine the scope, frequency, and method of archiving and purging data from Salesforce. For example, if reporting is necessary, then summary records or analytics snapshots can be used to store aggregated data and reduce the number of records that need to be archived. If the data being archived needs to be accessed in the future, then offsite storage or external objects can be used to retain the data and make it available on demand. If there are any regulatory restrictions, such as GDPR or HIPAA, then the archiving and purging plans need to comply with them and ensure data security and privacy.
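As an illustration of the aggregation idea, the following minimal sketch collapses exported Comment__c rows into one summary row per Topic__c per month before the detail records are purged; the file layout and field names are assumptions for illustration.

```python
import csv
from collections import Counter

# Count Comment__c records per Topic__c per month from an exported CSV.
counts = Counter()
with open("comments_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        month = row["CreatedDate"][:7]          # e.g. "2024-03"
        counts[(row["Topic__c"], month)] += 1

# Write one summary row per Topic__c and month; the detail rows can then be
# purged or moved to offsite storage while reporting needs are still met.
with open("comment_summaries.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Topic__c", "Month", "CommentCount"])
    for (topic, month), n in sorted(counts.items()):
        writer.writerow([topic, month, n])
```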