Salesforce Certified Data Architect Practice Test - Questions Answers, Page 14


UC has a variety of systems across its technology landscape, including SF, legacy enterprise resource planning (ERP) applications and homegrown CRM tools. UC has decided that they would like to consolidate all customer, opportunity and order data into Salesforce as part of its master data management (MDM) strategy.

What are the 3 key steps that a data architect should take when merging data from multiple systems into Salesforce? Choose 3 answers:

A. Create new fields to store additional values from all the systems.
B. Install a 3rd party AppExchange tool to handle the merger.
C. Analyze each system's data model and perform gap analysis.
D. Utilize an ETL tool to merge, transform and de-duplicate data.
E. Work with stakeholders to define record and field survivorship rules.
Suggested answer: C, D, E

Explanation:

The three key steps that a data architect should take when merging data from multiple systems into Salesforce are:

Analyze each system's data model and perform gap analysis. This step involves understanding the structure and meaning of the data in each system, identifying the common and unique data elements, and mapping the data fields between the systems. This step also involves assessing the quality and consistency of the data, and identifying any data cleansing or transformation needs.

Utilize an ETL tool to merge, transform, and de-duplicate data. This step involves using an ETL tool to connect to the source systems, extract the data, apply any data transformations or validations, and load the data into Salesforce. This step also involves applying de-duplication rules or algorithms to avoid creating duplicate records in Salesforce.

Work with stakeholders to define record and field survivorship rules. This step involves collaborating with the business users and owners of the data to determine which records and fields should be retained or overwritten in case of conflicts or discrepancies. This step also involves defining the criteria and logic for record and field survivorship, and implementing them in the ETL tool or in Salesforce.

Creating new fields to store additional values from all the systems is not a key step, but rather a possible outcome of the gap analysis. It may not be necessary or desirable to create new fields for every value from every system, as it may result in redundant or irrelevant data. Installing a 3rd party AppExchange tool to handle the merger is not a key step, but rather a possible option for choosing an ETL tool. It may not be the best option depending on the requirements, budget, and preferences of the organization.
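
To make the survivorship step concrete, here is a minimal Python sketch of record- and field-level survivorship rules. The source systems, priority order, and field names are hypothetical examples, not a prescribed design; real rules would be agreed with stakeholders and implemented in the ETL tool.

```python
# Minimal survivorship sketch. Source names, priorities, and field
# rules below are hypothetical examples.

# Higher number = more trusted source for record-level survivorship.
SOURCE_PRIORITY = {"ERP": 3, "HOMEGROWN_CRM": 2, "LEGACY": 1}

# Field-level rules: which source wins for a given field.
FIELD_RULES = {
    "BillingAddress": "ERP",        # ERP owns billing data
    "Email": "HOMEGROWN_CRM",       # the CRM owns contact details
}

def merge_records(duplicates):
    """Merge duplicate customer records into one surviving record.

    `duplicates` is a list of dicts, each with a `_source` key naming
    the system it came from plus its field values.
    """
    # Record survivorship: the record from the most trusted source survives.
    survivor = max(duplicates, key=lambda r: SOURCE_PRIORITY[r["_source"]])
    merged = dict(survivor)

    # Field survivorship: specific fields are overridden by the system
    # designated as authoritative for that field, if it has a value.
    for field, owning_source in FIELD_RULES.items():
        for rec in duplicates:
            if rec["_source"] == owning_source and rec.get(field):
                merged[field] = rec[field]
    return merged

dupes = [
    {"_source": "LEGACY", "Name": "Acme", "Email": "old@acme.com", "BillingAddress": "1 Old Rd"},
    {"_source": "ERP", "Name": "Acme Corp", "Email": "", "BillingAddress": "9 New St"},
    {"_source": "HOMEGROWN_CRM", "Name": "Acme", "Email": "sales@acme.com", "BillingAddress": ""},
]
print(merge_records(dupes))  # ERP record survives; Email comes from the CRM
```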

UC has a roll-up summary field on Account to calculate the count of contacts associated with an account. During the account load, SF is throwing an "Unable to lock a row" error.

Which solution should a data architect recommend, to resolve the error?

A. Leverage data loader platform API to load data.
B. Perform batch job in parallel mode and reduce batch size.
C. Perform batch job in serial mode and reduce batch size.
D. Defer roll-up summary fields calculation during data migration.
Suggested answer: C

Explanation:

The best solution to resolve the "Unable to lock a row" error during the account load is to perform the batch job in serial mode and reduce the batch size. Inserting or updating contacts triggers recalculation of the roll-up summary field, which requires a lock on the parent account record. When batches run in parallel, multiple batches can contend for the same account row and fail with lock errors. Running the job in serial mode processes one batch at a time, and smaller batches shorten how long each parent row stays locked, so serial mode with a reduced batch size is the recommended approach when loading data with tools like Data Loader or the Bulk API. Leveraging the data loader platform API is not a good option because it does not address the mode or batch size. Performing the batch job in parallel mode with a reduced batch size is not a good option because parallel batches can still contend for the same parent row even with smaller batches. Deferring roll-up summary fields calculation during data migration is not a good option because Salesforce does not provide a standard setting to defer or disable roll-up summary field calculation.
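
As a concrete illustration, the sketch below creates a Bulk API (1.0) job with concurrencyMode set to Serial, which is where serial mode is configured at the API level; the instance URL, session ID, and API version are placeholders. In Data Loader, the equivalent is enabling the serial-mode-for-Bulk-API setting and lowering the batch size.

```python
import requests

# Placeholders: supply your own instance URL and session ID.
INSTANCE = "https://yourInstance.my.salesforce.com"
SESSION_ID = "<session id from login>"

# Bulk API 1.0 lets you set concurrencyMode to Serial, so batches are
# processed one at a time and cannot contend for the same parent Account row.
job_xml = """<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <operation>insert</operation>
  <object>Contact</object>
  <concurrencyMode>Serial</concurrencyMode>
  <contentType>CSV</contentType>
</jobInfo>"""

resp = requests.post(
    f"{INSTANCE}/services/async/58.0/job",
    headers={"X-SFDC-Session": SESSION_ID, "Content-Type": "application/xml"},
    data=job_xml,
)
resp.raise_for_status()
print(resp.text)  # jobInfo XML echoing concurrencyMode = Serial

# Batches added to this job run serially; keeping each CSV batch small
# (e.g. a few hundred rows) shortens each parent-row lock.
```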

UC is migrating individual customer (B2C) data from legacy systems to SF. There are millions of customers stored as accounts and contacts in the legacy database.

Which object model should a data architect configure within SF?

A. Leverage person account object in Salesforce.
B. Leverage custom person account object in SF.
C. Leverage custom account and contact objects in SF.
D. Leverage standard account and contact objects in SF.
Suggested answer: A

Explanation:

The best object model to configure within SF for migrating individual customer (B2C) data from legacy systems is to leverage the person account object in Salesforce. Person accounts are a special type of account that stores information about an individual person by combining certain account and contact fields into a single record. Person accounts are useful for B2C scenarios where there is no need to associate a company name with a contact. Person accounts also support standard Salesforce features and functionality, such as leads, campaigns, reports, and dashboards. Leveraging a custom person account object in SF is not a good option because there is no such thing as a custom person account object. Leveraging custom account and contact objects in SF is not a good option because it would require creating and maintaining additional objects and fields that may not be necessary or compatible with standard Salesforce features. Leveraging standard account and contact objects in SF is not a good option because it would require filling in dummy values for the account name field, which is mandatory for standard accounts.
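
For illustration, here is a minimal sketch (using the simple_salesforce Python library, with placeholder credentials) of creating a person account via the API. It assumes Person Accounts are enabled in the org: person accounts are created on Account using a person account record type and FirstName/LastName rather than the business Name field.

```python
# Minimal sketch using simple_salesforce; credentials are placeholders.
# Assumes Person Accounts are enabled in the org.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...", security_token="...")

# Person account record types are flagged IsPersonType = true.
rt_id = sf.query(
    "SELECT Id FROM RecordType "
    "WHERE SObjectType = 'Account' AND IsPersonType = true LIMIT 1"
)["records"][0]["Id"]

# A person account is created on Account with contact-style fields
# (FirstName/LastName, PersonEmail) instead of the business Name field.
sf.Account.create({
    "RecordTypeId": rt_id,
    "FirstName": "Jane",
    "LastName": "Doe",
    "PersonEmail": "jane.doe@example.com",
})
```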

NTO has 1 million customer records spanning 25 years. As part of its new SF project, NTO would like to create a master data management strategy to help preserve the history and relevance of its customer data.

Which 3 activities will be required to identify a successful master data management strategy? Choose 3 answers:

A. Identify data to be replicated.
B. Create a data archive strategy.
C. Define the systems of record for critical data.
D. Install a data warehouse.
E. Choose a Business Intelligence tool.
Suggested answer: A, B, C

Explanation:

The three activities that will be required to identify a successful master data management strategy are:

Identify data to be replicated: This activity involves determining which data elements need to be copied from one system to another, and how frequently the replication should occur. This can help ensure data consistency and availability across systems.

Create a data archive strategy: This activity involves defining how historical data will be stored, accessed, and deleted over time. This can help optimize data storage, performance, and compliance.

Define the systems of record for critical data: This activity involves identifying which system owns and maintains the authoritative version of each data element. This can help avoid data conflicts and duplication across systems.

Installing a data warehouse is not a required activity, but rather a possible option for consolidating data from multiple sources for analytics purposes. Choosing a Business Intelligence tool is not a required activity, but rather a possible option for visualizing and reporting on data from various sources.
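
As one hedged example of what a data archive strategy might translate into operationally, the Python sketch below (using simple_salesforce, with placeholder credentials) exports closed cases older than a retention cutoff before they would be removed from the org. The object, cutoff, and file path are illustrative choices, not part of the question.

```python
# Minimal archive sketch; export-then-delete shown for illustration only.
import csv
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...", security_token="...")

# Example retention rule: archive cases closed more than 5 years ago.
records = sf.query_all(
    "SELECT Id, Subject, ClosedDate FROM Case "
    "WHERE IsClosed = true AND ClosedDate < LAST_N_YEARS:5"
)["records"]

with open("case_archive.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["Id", "Subject", "ClosedDate"])
    writer.writeheader()
    for rec in records:
        writer.writerow({k: rec[k] for k in ("Id", "Subject", "ClosedDate")})

# Only delete once the export is verified and stored in the archive system:
# for rec in records:
#     sf.Case.delete(rec["Id"])
```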

UC is migrating data from a legacy system to SF. UC would like to preserve the following information on records being migrated:

Date time stamps for created date and last modified date.

Ownership of records belonging to inactive users being migrated to Salesforce.

Which 2 solutions should a data architect recommend to preserve the date timestamps and ownership on records? Choose 2 answers.

A. Log a case with SF to update these fields.
B. Enable the Update Records with Inactive Owners permission.
C. Enable the Set Audit Fields upon Record Creation permission.
D. Enable Modify All and View All permissions.
Suggested answer: B, C

Explanation:

The two solutions that a data architect should recommend to preserve the date timestamps and ownership on records being migrated are:

Enable the Update Records with Inactive Owners permission: This permission allows users to update the owner and sharing settings of records owned by inactive users. This can help preserve the original ownership of records that belong to users who are no longer active in Salesforce.

Enable the Set Audit Fields upon Record Creation permission: This permission allows users to set audit fields (such as Created Date or Created By) when they create a record via API import tools like Data Loader. This can help preserve the original date timestamps of records that were created or modified in another system.

Logging a case with SF to update these fields is not a good solution because it is not necessary or feasible to ask Salesforce support to update these fields manually or programmatically. Enabling Modify All and View All permissions is not a good solution because it does not affect the ability to preserve the date timestamps and ownership on records; it only grants users access to all records regardless of sharing settings.
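
A minimal sketch of what a migration insert might look like once both permissions are enabled, using the simple_salesforce Python library; all credentials, IDs, and timestamp values are placeholders.

```python
# Assumes "Set Audit Fields upon Record Creation" and "Update Records
# with Inactive Owners" are enabled for the integration user.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...", security_token="...")

# With the audit-field permission on, CreatedDate/LastModifiedDate are
# writable on insert only, so the legacy timestamps survive the migration.
sf.Account.create({
    "Name": "Migrated Account",
    "OwnerId": "005000000000001AAA",  # placeholder; may reference an
                                      # inactive user when the inactive-
                                      # owners permission is enabled
    "CreatedDate": "2009-03-14T09:30:00Z",
    "LastModifiedDate": "2018-07-01T12:00:00Z",
})
```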

UC has migrated its back-office data into an on-premise database with REST API access. UC recently implemented Sales Cloud for its sales organization, but users are complaining about a lack of order data inside SF.

UC is concerned about SF storage limits but would still like Sales cloud to have access to the data.

Which design pattern should a data architect select to satisfy the requirement?

A. Migrate and persist the data in SF to take advantage of native functionality.
B. Use SF Connect to virtualize the data in SF and avoid storage limits.
C. Develop a bidirectional integration between the on-premise system and Salesforce.
D. Build a UI for the on-premise system and iframe it in Salesforce.
Suggested answer: B

Explanation:

The best design pattern to satisfy the requirement of accessing order data from an on-premise database with REST API access, without consuming SF storage limits, is to use SF Connect to virtualize the data in SF. SF Connect is an integration tool that allows users to access and integrate data from external sources using external objects. External objects are similar to custom objects, except that the data resides in another system and is accessed in real time via web service callouts. SF Connect supports various adapters to connect to different types of external data sources, such as OData, cross-org, or Apex custom adapters. Migrating and persisting the data in SF to take advantage of native functionality is not a good option because it would consume SF storage limits and require data synchronization between systems. Developing a bidirectional integration between the on-premise system and Salesforce is not a good option because it would be complex and costly to implement and maintain, and it would also consume SF storage limits. Building a UI for the on-premise system and iframing it in Salesforce is not a good option because it would not provide a seamless user experience and would not allow users to search, report, or perform actions on the external data.
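
For illustration, once the on-premise orders are exposed through SF Connect, they surface as an external object that can be queried like any other sObject without being stored in Salesforce. The sketch below (simple_salesforce, placeholder credentials) assumes a hypothetical Order__x external object with OrderNumber__c and OrderTotal__c fields.

```python
# Minimal sketch: reads against an external object are live callouts to
# the on-premise source; nothing counts toward SF data storage.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...", security_token="...")

# External objects carry the __x suffix; Order__x and its fields here
# are hypothetical names for the virtualized on-premise order table.
result = sf.query(
    "SELECT ExternalId, OrderNumber__c, OrderTotal__c FROM Order__x LIMIT 10"
)
for order in result["records"]:
    print(order["OrderNumber__c"], order["OrderTotal__c"])
```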

NTO has decided to franchise its brand. Upon implementation, 1,000 franchisees will be able to access NTO's product information and track large customer sales and opportunities through a portal. The franchisees will also be able to run monthly and quarterly sales reports and projections, as well as view the reports in dashboards.

Which licenses does NTO need to provide these features to the Franchisees?

A. Salesforce Sales Cloud license
B. Lightning Platform license
C. Customer Community license
D. Partner Community license
Suggested answer: D

Explanation:

The best license to provide these features to the franchisees is the Partner Community license. Partner Community licenses are designed for external users who collaborate with your sales team on deals, such as resellers, distributors, or brokers. Partner Community users can access standard CRM objects, such as accounts, contacts, leads, opportunities, campaigns, and reports. They can also access custom objects and run dashboards. The Salesforce Sales Cloud license is not a good option because it is intended for internal users who need full access to standard CRM and custom apps. The Lightning Platform license is not a good option because it is intended for users who need access to custom apps but not to standard CRM functionality. The Customer Community license is not a good option because it is intended for external users who need access to customer support features, such as cases and knowledge articles, but not to sales features.

A customer needs a sales model that allows the following:

Opportunities need to be assigned to sales people based on the zip code.

Each sales person can be assigned to multiple zip codes.

Each zip code is assigned to a sales area definition. Sales is aggregated by sales area for reporting.

What should a data architect recommend?

A. Assign opportunities using list views using zip code.
B. Add custom fields in opportunities for zip code and use assignment rules.
C. Allow sales users to manually assign opportunity ownership based on zip code.
D. Configure territory management feature to support opportunity assignment.
Suggested answer: D

Explanation:

The best solution to assign opportunities based on zip code and sales area is to configure the territory management feature to support opportunity assignment. Territory management is a feature that allows you to organize your sales team into territories based on criteria such as geography, industry, product line, or customer segment. You can assign accounts and opportunities to territories using assignment rules or manual sharing, and you can define forecast managers and roll up forecasts by territory, which supports reporting by sales area. Assigning opportunities using list views by zip code is not a good solution because it is inefficient and does not support reporting by sales area. Adding custom fields to opportunities for zip code and using assignment rules is not a good solution because it requires creating additional fields and does not support reporting by sales area. Allowing sales users to manually assign opportunity ownership based on zip code is not a good solution because it is prone to errors and does not support reporting by sales area.
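
As a hedged illustration of the reporting side: with Enterprise Territory Management enabled, opportunities carry a Territory2Id lookup, so sales can be aggregated by territory (sales area) directly in SOQL. The sketch below (simple_salesforce, placeholder credentials) is illustrative, not a full report definition.

```python
# Assumes Enterprise Territory Management is enabled and territories
# model the sales areas, so Opportunity.Territory2Id is populated.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...", security_token="...")

# Aggregate opportunity amounts by territory (sales area).
result = sf.query(
    "SELECT Territory2.Name area, SUM(Amount) total "
    "FROM Opportunity WHERE Territory2Id != null "
    "GROUP BY Territory2.Name"
)
for row in result["records"]:
    print(row["area"], row["total"])
```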

US is implementing Salesforce and will be using Salesforce to track customer complaints, provide white papers on products, and provide subscription (fee)-based support.

Which license type will US users need to fulfil US's requirements?

A. Lightning Platform Starter license.
B. Service Cloud license.
C. Salesforce license.
D. Sales Cloud license.
Suggested answer: B

Explanation:

The best license type to fulfil US's requirements is the Service Cloud license. Service Cloud licenses are designed for users who need access to customer service features, such as cases, solutions, knowledge articles, entitlements, service contracts, and the service console. Service Cloud users can also access standard CRM objects, such as accounts, contacts, leads, opportunities, campaigns, and reports. The Lightning Platform Starter license is not a good option because it is intended for users who need access to one custom app and a limited set of standard objects. "Salesforce license" is not a specific license type, but rather a generic term for any license that grants access to the Salesforce platform. The Sales Cloud license is not a good option because it is intended for users who need access to sales features, such as products, price books, quotes, orders, and forecasts.

US has released a new disaster recovery (DR) policy that states that cloud solutions need a business continuity plan in place separate from the cloud provider's built-in data recovery solution.

Which solution should a data architect use to comply with the DR policy?

A. Leverage a 3rd party tool that extracts Salesforce data/metadata and stores the information in an external protected system.
B. Leverage Salesforce weekly exports, and store data in flat files on a protected system.
C. Utilize an ETL tool to migrate data to an on-premise archive solution.
D. Write a custom batch job to extract data changes nightly, and store them in an external protected system.
Suggested answer: A

Explanation:

The best solution to comply with the DR policy is to leverage a 3rd party tool that extracts Salesforce data and metadata and stores the information in an external protected system. This solution creates a backup of Salesforce data and metadata in case of a disaster or data loss event, and allows data to be restored from the backup system to Salesforce if needed. There are various 3rd party tools available on the AppExchange or online that offer data backup and recovery services for Salesforce. Leveraging Salesforce weekly exports and storing data in flat files on a protected system is not a good solution because it does not include metadata backup and does not allow granular or automated data recovery. Utilizing an ETL tool to migrate data to an on-premise archive solution is not a good solution because it does not include metadata backup and may require complex data transformations and synchronizations. Writing a custom batch job to extract data changes nightly and storing them in an external protected system is not a good solution because it does not include metadata backup and may have performance or reliability issues.
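
As a hedged sketch of what a custom data export could look like, and why it falls short of the DR policy, here is a minimal Python example using simple_salesforce; object names, credentials, and output paths are placeholders. Note that it covers data only, not metadata, which is one reason the 3rd party backup tool is preferred.

```python
# Minimal data-only backup sketch. A full DR plan would also capture
# metadata (e.g. via the Metadata API), which this does not do.
import csv
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...", security_token="...")

for obj in ["Account", "Contact", "Opportunity"]:
    # Describe the object to pull its field list, skipping compound
    # fields (address/geolocation), which don't flatten cleanly to CSV.
    fields = [
        f["name"]
        for f in getattr(sf, obj).describe()["fields"]
        if f["type"] not in ("address", "location")
    ]
    rows = sf.query_all(f"SELECT {', '.join(fields)} FROM {obj}")["records"]

    with open(f"backup_{obj}.csv", "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=fields)
        writer.writeheader()
        for row in rows:
            writer.writerow({k: row.get(k) for k in fields})
```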
