Salesforce Certified Data Architect Practice Test - Questions Answers, Page 16
List of questions
Question 151

NTO has decided that it is going to build a channel sales portal with the following requirements:
1. External resellers are able to authenticate to the portal with a login.
2. Lead data, opportunity data and order data are available to authenticated users.
3. Authenticated users may need to run reports and dashboards.
4. There is no need for more than 10 custom objects or additional file storage.
Which Community Cloud license type should a data architect recommend to meet the portal requirements?
Explanation:
The Partner Community license is the best option for building a channel sales portal, as it allows external resellers to access lead, opportunity, and order data, and to run reports and dashboards. The Customer Community and Customer Community Plus licenses are better suited to customer service portals, while the Lightning External Apps Starter license does not support reports and dashboards.
Question 152

UC is implementing Sales Cloud for patient management and would like to encrypt sensitive patient records stored in files.
Which solution should a data architect recommend to solve this requirement?
Explanation:
Shield Platform Encryption is the recommended solution for encrypting sensitive patient records stored in files, as it provides encryption at rest for files and attachments as well as for standard and custom fields. Classic encryption supports only custom text fields with a maximum length of 175 characters and does not encrypt files. Third-party AppExchange apps may not provide the same level of security and compliance as Shield Platform Encryption, and storing files outside of Salesforce introduces additional complexity and latency.
Question 153

UC recently migrated 1 billion customer-related records from a legacy data store to Heroku Postgres. A subset of the data needs to be synchronized with Salesforce so that service agents can support customers directly within the service console. The remaining, non-synchronized data will need to be accessible from Salesforce at any point in time, but UC management is concerned about storage limitations.
What should a data architect recommend to meet these requirements with minimal effort?
Explanation:
Virtualizing the remaining data with Salesforce Connect and external objects is the best way to meet the requirements with minimal effort, as it allows Salesforce to access data stored in Heroku Postgres without storing it in Salesforce. This avoids both the storage limitations and data duplication. Heroku Connect can bi-directionally sync data between the systems, but it requires more configuration and maintenance. Making callouts to Heroku Postgres and persisting the data in Salesforce is not feasible for 1 billion records, and migrating the data to big objects may incur additional costs and require custom code such as Async SOQL to query it.
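For illustration, here is a minimal Python sketch (using the simple-salesforce library) of how an external object is consumed once Salesforce Connect is configured. The object and field names (Customer_History__x, Account_Number__c, Last_Order_Date__c) are hypothetical; the point is that external objects carry the __x suffix and answer ordinary SOQL, with each query delegated to the external system at request time rather than consuming org storage.

```python
# Hypothetical external object (__x suffix) surfaced via Salesforce Connect.
from simple_salesforce import Salesforce

sf = Salesforce(username="agent@example.com", password="...", security_token="...")

# Standard SOQL works against external objects; the query is delegated to
# Heroku Postgres at request time, so no rows count against org storage.
result = sf.query(
    "SELECT ExternalId, Account_Number__c, Last_Order_Date__c "
    "FROM Customer_History__x "
    "WHERE Account_Number__c = '100045'"
)
for rec in result["records"]:
    print(rec["Account_Number__c"], rec["Last_Order_Date__c"])
```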
Question 154

UC is building a Salesforce application to track contacts and the conferences they have attended, with the following requirements:
1. Contacts will be stored in the standard contact object.
2. Conferences will be stored in a custom Conference__c object.
3. Each contact may attend multiple conferences and each conference may be related to multiple contacts.
How should a data architect model the relationship between the contact and conference objects?
Explanation:
Implementing a Contact Conference junction object with a master-detail relationship to both Contact and Conference__c is the correct way to model the relationship, as it supports a many-to-many relationship between the two objects: each contact can attend multiple conferences, and each conference can be related to multiple contacts. Creating a master-detail relationship field on either the Contact or the Conference__c object would create a one-to-many relationship, which does not meet the requirements. Creating a lookup relationship field on the Contact object would also create a one-to-many relationship and would not enforce referential integrity.
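A minimal sketch of the junction pattern, assuming hypothetical API names Contact_Conference__c for the junction object and Contact__c / Conference__c for its two master-detail fields (simple-salesforce used for brevity):

```python
# Hypothetical API names: Conference__c (conference) and Contact_Conference__c
# (junction object with master-detail fields Contact__c and Conference__c).
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...", security_token="...")

contact = sf.Contact.create({"LastName": "Rivera", "Email": "rivera@example.com"})
conference = sf.Conference__c.create({"Name": "Dreamforce 2023"})

# One junction record represents one attendance; either side can have many.
sf.Contact_Conference__c.create({
    "Contact__c": contact["id"],
    "Conference__c": conference["id"],
})

# Traverse the many-to-many: every conference this contact has attended.
attended = sf.query(
    "SELECT Conference__r.Name FROM Contact_Conference__c "
    f"WHERE Contact__c = '{contact['id']}'"
)
for row in attended["records"]:
    print(row["Conference__r"]["Name"])
```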
Question 155

UC has a requirement to migrate 100 million order records from a legacy ERP application into the Salesforce platform. UC has no reporting requirements for the migrated data.
What should a data architect recommend to reduce the performance degradation of the platform?
Explanation:
Implementing a custom big object to store the data is the best recommendation for reducing performance degradation, as big objects are built for large volumes of data that do not need real-time access or reporting. Custom big objects are defined with the Metadata API and are designed to store billions of records per object. Creating a custom object or using the standard Order object would consume a large amount of storage and degrade the performance of queries and reports. A standard big object defined by Salesforce would not be applicable for order records, as standard big objects are predefined for specific use cases such as the field history archive.
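As a rough sketch of how the migrated data would be read back, the query below assumes a hypothetical custom big object Order_History__b whose index is defined on Customer_Id__c followed by Order_Date__c. Synchronous SOQL on a big object must filter on the index fields in index order, with a range operation allowed only on the last field used:

```python
# Hypothetical big object Order_History__b, indexed on Customer_Id__c
# then Order_Date__c.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...", security_token="...")

# Filters must follow the index definition: equality on leading index
# fields, with a range permitted only on the last field used.
history = sf.query(
    "SELECT Customer_Id__c, Order_Date__c, Amount__c "
    "FROM Order_History__b "
    "WHERE Customer_Id__c = 'C-001122' "
    "AND Order_Date__c >= 2020-01-01T00:00:00Z"
)
for row in history["records"]:
    print(row["Order_Date__c"], row["Amount__c"])
```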
Question 156

NTO has outgrown its current Salesforce org and will be migrating to a new org shortly. As part of this process, NTO will be migrating all of its metadata and data. NTO's data model in the source org has a complex relationship hierarchy with several master-detail and lookup relationships across objects, which should be maintained in the target org.
Which three things should a data architect do to maintain the relationship hierarchy during migration?
Choose 3 answers:
Explanation:
The correct answers are A, B, and D. To maintain the relationship hierarchy during migration, a data architect should: use Data Loader to export the data from the source org and then import or upsert it into the target org in sequential order; create an external ID field on each object in the target org and map source record IDs to this field; and replace source record IDs with new record IDs from the target org in the import file. These steps ensure that records are linked correctly and relationships are preserved. Option C is incorrect because redefining master-detail relationship fields as lookup relationship fields in the target org would change the behavior and security of the data model. Option E is incorrect because leaving the relationship fields populated with source record IDs in the import file would cause errors and prevent the records from being imported.
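A minimal sketch of the external-ID approach, assuming a Legacy_Id__c external ID field has been created on each object in the target org and a simple Account/Contact hierarchy. It upserts record-at-a-time through the REST API for clarity; a real migration would use Data Loader or the Bulk API with the same external-ID mapping:

```python
# Assumes hypothetical Legacy_Id__c external ID fields in the target org.
from simple_salesforce import Salesforce

source = Salesforce(username="admin@source.example", password="...", security_token="...")
target = Salesforce(username="admin@target.example", password="...", security_token="...")

accounts = source.query_all("SELECT Id, Name FROM Account")["records"]
contacts = source.query_all("SELECT Id, LastName, AccountId FROM Contact")["records"]

# 1. Load parents first, stamping each source Id into the external ID field.
for a in accounts:
    target.Account.upsert(f"Legacy_Id__c/{a['Id']}", {"Name": a["Name"]})

# 2. Load children, resolving each lookup through the parent's external ID,
#    so source record IDs never need to be manually remapped to target IDs.
for c in contacts:
    body = {"LastName": c["LastName"]}
    if c["AccountId"]:
        body["Account"] = {"Legacy_Id__c": c["AccountId"]}
    target.Contact.upsert(f"Legacy_Id__c/{c['Id']}", body)
```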
Question 157

UC has millions of cases and is running out of storage. Some user groups need access to historical cases for up to seven years.
Which 2 solutions should a data architect recommend in order to minimize performance and storage issues?
Choose 2 answers:
Explanation:
The correct answers are C and D. To minimize performance and storage issues, a data architect should recommend leveraging on-premises data archival with an integration to view the archived data, and leveraging a big object to archive case data with Lightning components to display it. These solutions let the relevant user groups access historical cases for up to seven years without consuming excessive storage or degrading the performance of queries and reports. Option A is incorrect because exporting data out of Salesforce into flat files on an external system makes the data difficult to access and query. Option B is incorrect because creating a custom object to store case history and running reports on it would still consume significant storage and impact query and report performance.
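As an illustrative sketch (hypothetical Case_Archive__b big object and field names), an archive pass selects closed cases past a cutoff, stages them for a big-object load (Bulk API accepts CSV files for custom big objects), and then deletes the originals to reclaim storage once the load has been verified:

```python
# Hypothetical target: a Case_Archive__b big object loaded via Bulk API CSV.
import csv
from datetime import datetime, timedelta, timezone
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...", security_token="...")

# Archive closed cases older than two years; they remain queryable from the
# big object for the rest of the seven-year retention window.
cutoff = (datetime.now(timezone.utc) - timedelta(days=730)).strftime("%Y-%m-%dT%H:%M:%SZ")
old_cases = sf.query_all(
    "SELECT Id, CaseNumber, Subject, ClosedDate FROM Case "
    f"WHERE IsClosed = true AND ClosedDate < {cutoff}"
)["records"]

# Stage rows for the big-object load.
with open("case_archive.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Case_Id__c", "Case_Number__c", "Subject__c", "Closed_Date__c"])
    for c in old_cases:
        writer.writerow([c["Id"], c["CaseNumber"], c["Subject"], c["ClosedDate"]])

# Once the archive load is verified, delete originals to reclaim storage.
sf.bulk.Case.delete([{"Id": c["Id"]} for c in old_cases])
```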
Question 158

What should a data architect do to provide additional guidance for users when they enter information in a standard field?
Explanation:
The correct answer is A. To provide additional guidance for users when they enter information in a standard field, a data architect should add custom help text under the field's properties. This displays a help icon next to the field label that users can hover over to read the text. Option B is incorrect because a custom page with user guidance requires additional development effort and may not be easily accessible. Option C is incorrect because putting help text in the field's default value would overwrite the actual default value and confuse users. Option D is incorrect because adding a label field with help text next to the field would clutter the page layout and may not be noticed by users.
Question 159

Universal Containers (UC) is going through a major reorganization of its sales team. This will require changes to a large number of group members and sharing rules. UC's administrator is concerned about long processing times and failures during the process.
What should a data architect implement to make these changes efficiently?
Explanation:
The correct answer is B. To make the changes efficiently, a data architect should enable Defer Sharing Calculation before making the sharing rule changes. This allows the administrator to make multiple changes to sharing rules without a recalculation after each change, which can take a long time and cause failures. The sharing rules can then be recalculated during off-peak hours when fewer users are online. Option A is incorrect because logging a case with Salesforce to make sharing rule changes will not speed up the process or prevent failures. Option C is incorrect because deleting old sharing rules and building new ones does not reduce processing time or the failure rate. Option D is incorrect because logging out all users before changing sharing rules would disrupt business operations and may not be feasible.
Question 160

Universal Containers (UC) stores 10 million rows of inventory data in a cloud database. As part of creating a connected experience in Salesforce, UC would like to surface this inventory data in Sales Cloud without an import. UC has asked its data architect to determine whether Salesforce Connect is needed.
Which three considerations should the data architect make when evaluating the need for Salesforce Connect?
Explanation:
The correct answers are A, D, and E. The data architect should consider these three factors when evaluating the need for Salesforce Connect: you want real-time access to the latest data from other systems; you have a large amount of data that you don't want to copy into your Salesforce org; and you need to access only small amounts of external data at any one time. These factors indicate that Salesforce Connect is a suitable solution for creating a connected experience without importing the inventory data from the cloud database. Salesforce Connect lets Salesforce access external data via OData or custom adapters without storing it in Salesforce, which reduces storage costs and keeps the data fresh. It also supports pagination to optimize performance when accessing small amounts of external data at a time. Option B is incorrect because if you want to copy subsets of a large data set into Salesforce, tools such as Data Loader or an API integration are more appropriate than Salesforce Connect. Option C is incorrect because exposing data via a virtual private connection is a networking concern addressed by tools such as a VPN or VPC peering, not by Salesforce Connect.