Salesforce Certified Data Architect Practice Test - Questions Answers, Page 16

NTO has decided that it is going to build a channel sales portal with the following requirements:

1. External resellers are able to authenticate to the portal with a login.

2. Lead data, opportunity data and order data are available to authenticated users.

3. Authenticated users may need to run reports and dashboards.

4. There is no need for more than 10 custom objects or additional file storage.

Which Community Cloud license type should a data architect recommend to meet the portal requirements?

A. Customer Community.
B. Lightning External Apps Starter.
C. Customer Community Plus.
D. Partner Community.
Suggested answer: D

Explanation:

The Partner Community license type is the best option for building a channel sales portal, as it allows external resellers to access lead, opportunity, and order data, as well as run reports and dashboards. The Customer Community and Customer Community Plus license types are more suitable for customer service portals, while the Lightning External Apps Starter license type does not support reports and dashboards.

UC is implementing Sales Cloud for patient management and would like to encrypt sensitive patient records being stored in files.

Which solution should a data architect recommend to solve this requirement?

A. Implement Shield Platform Encryption to encrypt files.
B. Use classic encryption to encrypt files.
C. Implement a third-party AppExchange app to encrypt files.
D. Store files outside of Salesforce and access them in real time.
Suggested answer: A

Explanation:

Shield Platform Encryption is the recommended solution for encrypting sensitive patient records stored in files, as it provides encryption at rest for files and attachments, as well as for standard and custom fields. Classic encryption only supports custom text fields with a maximum length of 175 characters and does not encrypt files. Third-party AppExchange apps may not provide the same level of security and compliance as Shield Platform Encryption. Storing files outside of Salesforce may introduce additional complexity and latency.

UC recently migrated 1 billion customer-related records from a legacy data store to Heroku Postgres. A subset of the data needs to be synchronized with Salesforce so that service agents are able to support customers directly within the service console. The remaining non-synchronized set of data will need to be accessed by Salesforce at any point in time, but UC management is concerned about storage limitations.

What should a data architect recommend to meet these requirements with minimal effort?

A. Virtualize the remaining set of data with Salesforce Connect and external objects.
B. Use Heroku Connect to bi-directionally sync all data between systems.
C. As needed, make callouts into Heroku Postgres and persist the data in Salesforce.
D. Migrate the data to big objects and leverage async SOQL with custom objects.
Suggested answer: A

Explanation:

Virtualizing the remaining set of data with Salesforce Connect and external objects is the best way to meet the requirements with minimal effort, as it allows Salesforce to access data stored in Heroku Postgres without storing it in Salesforce. This avoids the storage limitations and prevents data duplication. Heroku Connect can bi-directionally sync data between systems, but it requires more configuration and maintenance. Making callouts to Heroku Postgres and persisting the data in Salesforce may not be feasible for 1 billion records. Migrating the data to big objects may incur additional costs and require custom code to use async SOQL.
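
As a rough illustration of the recommended option, the Python sketch below uses the simple-salesforce library to read virtualized rows once Salesforce Connect is configured. The external object Customer_Record__x, its fields, and the credentials are hypothetical placeholders, not part of the question:

```python
# Minimal sketch, assuming a Salesforce Connect external data source already
# maps the Heroku Postgres table to an external object named Customer_Record__x
# (object, field, and credential values here are hypothetical placeholders).
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",
                password="password",
                security_token="token")

# External objects carry the __x suffix and are queried with ordinary SOQL.
# The rows stay in Heroku Postgres and are fetched on demand, so they do not
# count against Salesforce data storage.
result = sf.query(
    "SELECT ExternalId, Customer_Name__c, Status__c "
    "FROM Customer_Record__x LIMIT 50"
)
for record in result["records"]:
    print(record["Customer_Name__c"], record["Status__c"])
```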

UC is building a Salesforce application to track contacts and the conferences they have attended, with the following requirements:

1. Contacts will be stored in the standard Contact object.

2. Conferences will be stored in a custom Conference__c object.

3. Each contact may attend multiple conferences and each conference may be related to multiple contacts.

How should a data architect model the relationship between the contact and conference objects?

A. Implement a ContactConference junction object with master-detail relationships to both Contact and Conference__c.
B. Create a master-detail relationship field on the Contact object.
C. Create a master-detail relationship field on the Conference object.
D. Create a lookup relationship field on the Contact object.
Suggested answer: A

Explanation:

Implementing a ContactConference junction object with master-detail relationships to both Contact and Conference__c is the correct way to model the relationship between the contact and conference objects, as it allows a many-to-many relationship between them. This means that each contact can attend multiple conferences, and each conference can be related to multiple contacts. Creating a master-detail relationship field on either the Contact or the Conference object would create a one-to-many relationship, which does not meet the requirements. Creating a lookup relationship field on the Contact object would also create a one-to-many relationship, and would not enforce referential integrity.
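
To make the junction pattern concrete, here is a minimal Python sketch using simple-salesforce that creates one attendance link. The junction object name ContactConference__c and its field names are hypothetical assumptions:

```python
# Minimal sketch, assuming a junction object ContactConference__c with
# master-detail fields Contact__c and Conference__c (names are hypothetical).
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",
                password="password",
                security_token="token")

contact = sf.Contact.create({"LastName": "Rivera"})
conference = sf.Conference__c.create({"Name": "Dreamforce 2023"})

# One junction record per contact/conference pair gives the many-to-many
# relationship; because both relationships are master-detail, deleting either
# parent record also deletes the junction record.
sf.ContactConference__c.create({
    "Contact__c": contact["id"],
    "Conference__c": conference["id"],
})
```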

UC has a requirement to migrate 100 million order records from a legacy ERP application into the Salesforce platform. UC does not have any requirements around reporting on the migrated data.

What should a data architect recommend to reduce the performance degradation of the platform?

A. Create a custom object to store the data.
B. Use a standard big object defined by Salesforce.
C. Use the standard "Order" object to store the data.
D. Implement a custom big object to store the data.
Suggested answer: D

Explanation:

Implementing a custom big object to store the data is the best recommendation to reduce the performance degradation of the platform, as it allows storing large volumes of data that do not need real-time access or reporting. Custom big objects can be defined using the Metadata API and support up to 1 billion records per object. Creating a custom object or using the standard Order object would consume a lot of storage space and impact the performance of queries and reports. Using a standard big object defined by Salesforce would not be applicable for order records, as standard big objects are predefined for specific use cases such as audit trails or field history.
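
As a hedged sketch of how such an archive might be loaded, the snippet below pushes rows into a hypothetical custom big object through the Bulk API with simple-salesforce. The object Order_Archive__b and its fields are illustrative assumptions; big objects are typically populated via the Bulk API or Apex rather than the synchronous create call:

```python
# Minimal sketch, assuming a custom big object Order_Archive__b defined with
# the fields below and an index on Order_Number__c (all names hypothetical).
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",
                password="password",
                security_token="token")

rows = [
    {"Order_Number__c": "ORD-000001", "Amount__c": 120.50},
    {"Order_Number__c": "ORD-000002", "Amount__c": 89.99},
]

# Writes that match an existing index value overwrite that row, so reloads of
# the same archive batch are effectively idempotent.
sf.bulk.Order_Archive__b.insert(rows, batch_size=10000)
```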

NTO has outgrown its current Salesforce org and will be migrating to a new org shortly. As part of this process, NTO will be migrating all of its metadata and data. NTO's data model in the source org has a complex relationship hierarchy with several master-detail and lookup relationships across objects, which should be maintained in the target org.

Which three things should a data architect do to maintain the relationship hierarchy during migration?

Choose 3 answers:

A. Use Data Loader to export the data from the source org and then import or upsert into the target org in sequential order.
B. Create an external ID field for each object in the target org and map source record IDs to this field.
C. Redefine the master-detail relationship fields as lookup relationship fields in the target org.
D. Replace source record IDs with new record IDs from the target org in the import file.
E. Keep the relationship fields populated with the source record IDs in the import file.
Suggested answer: A, B, D

Explanation:

The correct answer is A, B, and D. To maintain the relationship hierarchy during migration, a data architect should use Data Loader to export the data from the source org and then import or upsert into the target org in sequential order, create an external ID field for each object in the target org and map source record IDs to this field, and replace source record IDs with new record IDs from the target org in the import file. These steps will ensure that the records are linked correctly and the relationships are preserved. Option C is incorrect because redefining the master-detail relationship fields as lookup relationship fields in the target org will change the behavior and security of the data model. Option E is incorrect because keeping the relationship fields populated with the source record IDs in the import file will cause errors and prevent the records from being imported.
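
For illustration, once the external ID field from option B is in place, the upsert flow from options A and D can be scripted roughly as follows with simple-salesforce. The field Legacy_Id__c and all ID values are hypothetical:

```python
# Minimal sketch, assuming each target-org object has a Legacy_Id__c external
# ID field holding the source org's record ID (all values are placeholders).
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",
                password="password",
                security_token="token")

# Upsert the parent on its external ID first, so re-runs don't create duplicates.
sf.Account.upsert("Legacy_Id__c/001SRC000000001", {"Name": "Acme Corp"})

# The child references its parent by the same external ID value, so the import
# never needs target-org record IDs resolved by hand; Data Loader's upsert
# offers equivalent external-ID matching for lookup fields.
sf.Contact.upsert("Legacy_Id__c/003SRC000000001", {
    "LastName": "Nguyen",
    "Account": {"Legacy_Id__c": "001SRC000000001"},
})
```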

UC has millions of cases and is running out of storage. Some user groups need to have access to historical cases for up to 7 years.

Which 2 solutions should a data architect recommend in order to minimize performance and storage issues?

Choose 2 answers:

A. Export data out of Salesforce and store it in flat files on an external system.
B. Create a custom object to store case history and run reports on it.
C. Leverage on-premise data archival and build an integration to view archived data.
D. Leverage big objects to archive case data and Lightning components to show archived data.
Suggested answer: C, D

Explanation:

The correct answer is C and D. To minimize performance and storage issues, a data architect should recommend leveraging on-premise data archival and building an integration to view archived data, and leveraging big objects to archive case data with Lightning components to show the archived data. These solutions will allow some user groups to access historical cases for up to 7 years without consuming too much storage space or affecting the performance of queries and reports. Option A is incorrect because exporting data out of Salesforce and storing it in flat files on an external system will make it difficult to access and query the data. Option B is incorrect because creating a custom object to store case history and running reports on it will still consume a lot of storage space and impact the performance of queries and reports.
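
As a sketch of the retrieval side of option D, the query below shows the shape of a big-object SOQL read; in practice a Lightning component would run the equivalent query from Apex. The big object Case_Archive__b and its index fields are hypothetical:

```python
# Minimal sketch, assuming archived cases live in a custom big object
# Case_Archive__b whose index leads with Account__c (names are hypothetical).
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",
                password="password",
                security_token="token")

# Big-object SOQL must filter on index fields in their defined order; a
# Lightning component would issue the equivalent query from Apex to display
# the archived cases to the user groups that need the 7-year history.
result = sf.query(
    "SELECT Account__c, Closed_Date__c, Subject__c "
    "FROM Case_Archive__b "
    "WHERE Account__c = '001000000000001AAA'"
)
print(result["totalSize"], "archived cases found")
```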

What should a data architect do to provide additional guidance for users when they enter information in a standard field?

A. Provide custom help text under the field properties.
B. Create a custom page with help text for user guidance.
C. Add custom help text in the default value for the field.
D. Add a label field with help text adjacent to the custom field.
Suggested answer: A

Explanation:

The correct answer is A. To provide additional guidance for users when they enter information in a standard field, a data architect should provide custom help text under the field properties. This displays a help icon next to the field label that users can hover over to see the help text. Option B is incorrect because creating a custom page with help text for user guidance requires additional development effort and may not be easily accessible to users. Option C is incorrect because adding custom help text in the default value for the field would overwrite the actual default value and may confuse users. Option D is incorrect because adding a label field with help text adjacent to the custom field would clutter the page layout and may not be visible to users.

Universal Containers (UC) is going through a major reorganization of its sales team. This will require changes to a large number of group members and sharing rules. UC's administrator is concerned about long processing times and failures during the process.

What should a Data architect implement to make changes efficiently?

A. Log a case with Salesforce to make sharing rule changes.
B. Enable Defer Sharing Calculation prior to making sharing rule changes.
C. Delete old sharing rules and build new sharing rules.
D. Log out all users and make changes to sharing rules.
Suggested answer: B

Explanation:

The correct answer is B. To make changes efficiently, a data architect should enable Defer Sharing Calculation prior to making sharing rule changes. This allows the administrator to make multiple changes to sharing rules without recalculating them after each change, which can take a long time and cause failures. The sharing rules can be recalculated later, when there are fewer users online or during off-peak hours. Option A is incorrect because logging a case with Salesforce to make sharing rule changes will not speed up the process or prevent failures. Option C is incorrect because deleting old sharing rules and building new sharing rules will not reduce the processing time or failure rate. Option D is incorrect because logging out all users and making changes to sharing rules would disrupt business operations and may not be feasible.

Universal Containers (UC) stores 10 million rows of inventory data in a cloud database. As part of creating a connected experience in Salesforce, UC would like to surface this inventory data in Sales Cloud without an import. UC has asked its data architect to determine if Salesforce Connect is needed.

Which three considerations should the data architect make when evaluating the need for Salesforce Connect?

A. You want real-time access to the latest data from other systems.
B. You have a large amount of data and would like to copy subsets of it into Salesforce.
C. You need to expose data via a virtual private connection.
D. You have a large amount of data that you don't want to copy into your Salesforce org.
E. You need to access small amounts of external data at any one time.
Suggested answer: A, D, E

Explanation:

The correct answer is A, D, and E. The data architect should consider these three factors when evaluating the need for Salesforce Connect: you want real-time access to the latest data from other systems, you have a large amount of data that you don't want to copy into your Salesforce org, and you need to access small amounts of external data at any one time. These factors indicate that Salesforce Connect is a suitable solution for creating a connected experience in Salesforce without importing inventory data from a cloud database. Salesforce Connect allows Salesforce to access external data via OData or custom adapters without storing it in Salesforce, which reduces storage costs and ensures data freshness. Salesforce Connect also supports pagination and caching to optimize performance when accessing small amounts of external data at any one time. Option B is incorrect because if you have a large amount of data and would like to copy subsets of it into Salesforce, you may not need Salesforce Connect but rather other tools such as Data Loader or API integration. Option C is incorrect because if you need to expose data via a virtual private connection, you may not need Salesforce Connect but rather other tools such as a VPN or VPC peering.
