
Salesforce Certified Data Architect Practice Test - Questions Answers, Page 13

NTO (Northern Trail Outfitters) has a complex Salesforce org which has been developed over the past 5 years. Internal users are complaining about multiple data issues, including incomplete and duplicate data in the org. NTO has decided to engage a data architect to analyze and define data quality standards.

Which 3 key factors should a data architect consider while defining data quality standards? Choose 3 answers:

A. Define data duplication standards and rules
B. Define key fields in staging database for data cleansing
C. Measure data timeliness and consistency
D. Finalize an extract transform load (ETL) tool for data migration
E. Measure data completeness and accuracy
Suggested answer: A, C, E

Explanation:

Defining data duplication standards and rules, measuring data timeliness and consistency, and measuring data completeness and accuracy are three key factors that a data architect should consider while defining data quality standards. Defining data duplication standards and rules can help prevent or reduce duplicate records in the org by specifying criteria and actions for identifying and merging duplicates. Measuring data timeliness and consistency can help ensure that the data is up-to-date, reliable, and synchronized across different sources. Measuring data completeness and accuracy can help ensure that the data is sufficient, relevant, and correct for the intended purposes.
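These standards are easiest to enforce when they are measured regularly. Below is a minimal, hedged sketch of such measurements run against an exported data set; the file name and column names (Email, Phone, LastModifiedDate) are hypothetical placeholders, and pandas is assumed to be available.

```python
# Minimal data-quality profiling sketch against a hypothetical CSV export
# ("leads_export.csv" with Email, Phone, and LastModifiedDate columns).
import pandas as pd

df = pd.read_csv("leads_export.csv")

# Completeness: share of non-empty values per key field.
completeness = df[["Email", "Phone"]].notna().mean()

# Duplication: records sharing the same Email flagged as potential duplicates.
duplicates = df.duplicated(subset=["Email"], keep="first").sum()

# Timeliness: share of records modified within the last 365 days.
last_modified = pd.to_datetime(df["LastModifiedDate"], utc=True, errors="coerce")
timeliness = ((pd.Timestamp.now(tz="UTC") - last_modified).dt.days <= 365).mean()

print(completeness, duplicates, timeliness)
```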

Universal Containers (UC) requires 2 years of customer-related cases to be available in Salesforce for operational reporting. Any cases older than 2 years and up to 7 years need to be available on demand to the Service agents. UC creates 5 million cases per year.

Which 2 data archiving strategies should a data architect recommend? Choose 2 options:

A. Use custom objects for cases older than 2 years and use a nightly batch to move them.
B. Sync cases older than 2 years to an external database, and provide Service agents access to the database.
C. Use Big objects for cases older than 2 years, and use a nightly batch to move them.
D. Use Heroku and external objects to display cases older than 2 years, and use Bulk API to hard delete them from Salesforce.
Suggested answer: C, D

Explanation:

The best data archiving strategies for UC are to use Big objects and Heroku with external objects. Big objects allow storing large amounts of data on the Salesforce platform without counting against standard data storage limits or degrading performance; they can be populated and queried with Apex, SOQL, and APIs, although they do not support triggers, flows, or standard reports. Heroku is a cloud platform that can host external databases and integrate with Salesforce using external objects. External objects enable on-demand access to external data sources via standard Salesforce APIs and user interfaces. Using the Bulk API to hard delete cases from Salesforce frees up storage space and improves performance.
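As an illustration of the hard-delete step in option D, the sketch below creates a Bulk API 2.0 hardDelete job for already-archived cases. It is a sketch only: the instance URL, access token, API version, and record Ids are placeholders, and the running user would need the Bulk API Hard Delete permission.

```python
# Hedged sketch: hard-deleting already-archived Case records with Bulk API 2.0.
# INSTANCE_URL, ACCESS_TOKEN, the API version, and the Ids are placeholders.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"   # placeholder
ACCESS_TOKEN = "00D...session_token"                       # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}", "Content-Type": "application/json"}
BASE = f"{INSTANCE_URL}/services/data/v58.0/jobs/ingest"

# 1. Create a hardDelete ingest job for the Case object.
job = requests.post(BASE, headers=HEADERS, json={
    "object": "Case", "operation": "hardDelete", "contentType": "CSV"
}).json()

# 2. Upload a CSV of the record Ids identified by the nightly archive batch.
csv_ids = "Id\n5003000000D8cuI\n5003000000D8cuJ\n"          # placeholder Ids
requests.put(f"{BASE}/{job['id']}/batches", data=csv_ids,
             headers={"Authorization": HEADERS["Authorization"],
                      "Content-Type": "text/csv"})

# 3. Mark the upload complete so Salesforce starts processing the job.
requests.patch(f"{BASE}/{job['id']}", headers=HEADERS,
               json={"state": "UploadComplete"})
```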

NTO would like to retrieve their Salesforce org's metadata programmatically for backup within various external systems. Which API is the best fit for accomplishing this task?

A. Metadata API
B. Tooling API
C. Bulk API in serial mode
D. SOAP API
Suggested answer: A

Explanation:

The best API for retrieving Salesforce org metadata programmatically is the Metadata API. The Metadata API provides access to the metadata that defines the structure and configuration of an org, such as custom objects, fields, workflows, security settings, etc. It also supports deploying, retrieving, creating, updating, and deleting metadata components. The Metadata API can be used with various tools, such as Ant, Workbench, or IDEs.
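As a sketch of how such a backup might be scripted, the example below shells out to the Salesforce CLI, which performs the retrieval through the Metadata API. It assumes the script runs inside an SFDX project, the CLI is authenticated to the placeholder alias nto-prod, and package.xml lists the metadata types to back up.

```python
# Hedged sketch: nightly metadata backup via the Salesforce CLI, which
# retrieves components through the Metadata API. The org alias, manifest,
# and package directory ("force-app") are assumptions, not requirements.
import datetime
import shutil
import subprocess

ORG_ALIAS = "nto-prod"                      # placeholder org alias
stamp = datetime.date.today().isoformat()

# Retrieve everything named in package.xml into the project's package directory.
subprocess.run(
    ["sf", "project", "retrieve", "start",
     "--manifest", "package.xml", "--target-org", ORG_ALIAS],
    check=True,
)

# Zip the retrieved source so it can be pushed to an external backup system.
shutil.make_archive(f"metadata-backup-{stamp}", "zip", "force-app")
```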

A customer operating in a highly regulated industry is planning to implement Salesforce. The customer information maintained in Salesforce includes the following:

Personally identifiable information (PII)

IP restrictions on profiles organized by geographic location

Financial records that need to be private and accessible only by the assigned Sales associate.

Users should not be allowed to export information from Salesforce.

Enterprise security has mandated that access be restricted to users within a specific geography and that user activity be monitored in detail. Which 3 Salesforce Shield capabilities should a data architect recommend? Choose 3 answers:

A. Event Monitoring to monitor all user activities
B. Restrict access to SF from users outside a specific geography
C. Prevent Sales users' access to customer PII information
D. Transaction Security policies to prevent export of SF data
E. Encrypt sensitive customer information maintained in SF
Suggested answer: B, D, E

Explanation:

The best Salesforce Shield capabilities for the customer are to restrict access to Salesforce from users outside a specific geography, implement Transaction Security policies to prevent export of Salesforce data, and encrypt sensitive customer information maintained in Salesforce. Salesforce Shield is a set of security features that help protect enterprise data on the Salesforce platform, comprising Event Monitoring, Platform Encryption, and Field Audit Trail. Restricting access from users outside a specific geography can be enforced with network-based controls such as profile IP ranges or VPN. Transaction Security policies can define actions or notifications based on user behavior patterns, such as exporting data or logging in from an unknown device. Platform Encryption can encrypt data at rest using a tenant secret key that is controlled by the customer.
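For the user-activity monitoring requirement, Event Monitoring exposes its data through the EventLogFile object, which can be pulled on a schedule. The sketch below queries report-export events over the REST API; the instance URL, access token, and API version are placeholders, and Event Monitoring must be licensed in the org.

```python
# Hedged sketch: listing Event Monitoring log files for report export events.
# INSTANCE_URL, ACCESS_TOKEN, and the API version are placeholders.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"   # placeholder
ACCESS_TOKEN = "00D...session_token"                       # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

soql = ("SELECT Id, EventType, LogDate, LogFileLength "
        "FROM EventLogFile WHERE EventType = 'ReportExport'")
resp = requests.get(f"{INSTANCE_URL}/services/data/v58.0/query",
                    headers=HEADERS, params={"q": soql})

for log in resp.json().get("records", []):
    # The CSV body of each log can be downloaded separately from
    # /services/data/v58.0/sobjects/EventLogFile/<Id>/LogFile.
    print(log["Id"], log["EventType"], log["LogDate"])
```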

NTO processes orders from its website via an order management system (OMS). The OMS stores over 2 million historical records and is currently not integrated with Salesforce. The Sales team at NTO uses Sales Cloud and would like visibility into related customer orders, yet they do not want to persist millions of records directly in Salesforce. NTO has asked the data architect to evaluate Salesforce Connect and the concept of data virtualization. Which 3 considerations are needed prior to a Salesforce Connect implementation?

Choose 3 answers:

A. Create a 2nd system Admin user for authentication to the external source.
B. Develop an object relationship strategy.
C. Identify the external tables to sync into external objects.
D. Assess whether the external data source is reachable via an OData endpoint.
E. Configure a middleware tool to poll external table data.
Suggested answer: B, C, D

Explanation:

The three considerations needed prior to a Salesforce Connect implementation are to develop an object relationship strategy, identify the external tables to sync into external objects, and assess whether the external data source is reachable via an OData endpoint. Salesforce Connect is a feature that integrates external data sources with Salesforce using external objects. External objects are similar to custom objects, but they store only metadata, not the data itself; they enable on-demand access to external data via standard Salesforce APIs and user interfaces. To implement Salesforce Connect, a data architect needs to consider how the external objects will relate to other objects in Salesforce, which external tables will be exposed as external objects, and whether the external data source supports the OData protocol for data access.
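A quick way to perform the reachability check in option D is to request the service's OData metadata document, which Salesforce Connect also relies on to map tables to external objects. The sketch below is illustrative only; the endpoint URL and credentials are placeholders.

```python
# Hedged sketch: checking that the OMS data source exposes a reachable OData
# endpoint before configuring it as an external data source. URL and
# credentials are placeholders.
import requests

ODATA_ROOT = "https://oms.example.com/odata/v4"        # placeholder endpoint
AUTH = ("integration_user", "secret")                   # placeholder credentials

# Every OData service publishes a $metadata document describing its entity sets.
resp = requests.get(f"{ODATA_ROOT}/$metadata", auth=AUTH, timeout=10)

print("Reachable:", resp.status_code == 200)
print("Content type:", resp.headers.get("Content-Type"))  # expect an XML (EDMX) document
```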

UC has been using Salesforce for 10 years. Lately, users have noticed that pages load slowly when viewing Customer and Account list views.

To mitigate this, UC will implement a data archive strategy to reduce the amount of data actively loaded.

Which 2 tasks are required to define the strategy? Choose 2 answers:

A. Identify the recovery point objective.
B. Identify how the archive data will be accessed and used.
C. Identify the recovery time objective.
D. Identify the data retention requirements.
Suggested answer: B, D

Explanation:

The two tasks required to define the data archive strategy are to identify how the archive data will be accessed and used, and identify the data retention requirements. Data archiving is the process of moving infrequently used or historical data from active storage to a separate storage location for long-term retention. Data archiving can improve performance, reduce storage costs, and comply with legal or regulatory obligations. To define a data archive strategy, a data architect needs to consider how the archived data will be accessed and used by different users or processes in Salesforce or outside Salesforce, and how long the archived data needs to be retained based on business or legal requirements.

UC uses classic encryption for custom fields and leverages weekly data reports for data backups. During validation of the exported data, UC discovered that encrypted field values are exported in their encrypted form. What should a data architect recommend to make sure decrypted values are exported during the data export?

A. Set a standard profile for the Data Migration user, and assign View Encrypted Data.
B. Create another field to copy data from the encrypted field, and use this field in the export.
C. Leverage an Apex class to decrypt data before exporting it.
D. Set up a custom profile for the data migration user, and assign View Encrypted Data.
Suggested answer: B

Explanation:

The best solution to make sure decrypted values are exported during data export is to create another field to copy data from the encrypted field and use that field in the export. Classic encryption does not support exporting decrypted values of encrypted fields: the View Encrypted Data permission only allows users to view decrypted values in the user interface, not in reports or data exports. Therefore, a workaround is to copy the value of the encrypted field to another field and use that field for the export. However, this solution has drawbacks, such as exposing sensitive data in plain text and consuming extra storage space. A better long-term solution would be Shield Platform Encryption, which encrypts data at rest while still returning decrypted values through reports, exports, and APIs to users with the appropriate field-level permissions.

Universal Containers is implementing Salesforce lead management. UC procures lead data from multiple sources and would like to make sure lead data has company profile and location information. Which solution should a data architect recommend to make sure lead data has both profile and location information?

A. Ask salespeople to search for and populate company profile and location data.
B. Run reports to identify records that do not have company profile and location data.
C. Leverage external data providers to populate company profile and location data.
D. Export data out of Salesforce and send it to another team to populate company profile and location data.
Suggested answer: C

Explanation:

The best solution to make sure lead data has both profile and location information is to leverage external data providers to populate company profile and location data. External data providers can enrich lead data with additional information from third-party sources, such as Dun & Bradstreet, ZoomInfo, or Clearbit, which helps improve lead quality, segmentation, and conversion. Salesforce supports integrating with external data providers through AppExchange data enrichment solutions (and, historically, Data.com Clean). Asking salespeople to search for and populate company profile and location data is inefficient and error-prone. Running reports to identify records that do not have company profile and location data is useful, but does not solve the problem of how to populate the missing data. Exporting data out of Salesforce and sending it to another team to populate company profile and location data is cumbersome and time-consuming.

UC has millions of case records with case history and SLA data. UC's compliance team would like historical cases to be accessible for 10 years for audit purposes.

What solution should a data architect recommend?

A. Archive case data using the Salesforce archiving process.
B. Purchase more data storage to support the case object.
C. Use a custom object to store archived case data.
D. Use a custom Big object to store archived case data.
Suggested answer: D

Explanation:

The best solution for keeping historical cases accessible for 10 years for audit purposes is to use a custom Big object to store archived case data. Big objects are a type of custom object that can store massive amounts of data on the Salesforce platform without counting against standard data storage limits or degrading performance; they can be populated and queried with Apex, SOQL, and APIs, although they do not support triggers, flows, or standard reports. Big objects are well suited to archiving historical data that needs to be retained for compliance or analytics purposes. Relying on the standard Salesforce archiving process is not a good option because its criteria cannot be customized and it does not give flexible access to the archived data via Apex or APIs. Purchasing more data storage to support the case object is expensive and may still impact performance. Using a custom object to store archived case data is not scalable and consumes a large amount of storage space.
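For the audit-access side, archived rows in a custom big object remain queryable with SOQL as long as the filter follows the object's index. A minimal sketch over the REST query endpoint follows; the big object and field names (Case_Archive__b, AccountId__c, and so on), the instance URL, token, and API version are all hypothetical placeholders.

```python
# Hedged sketch: retrieving archived cases from a hypothetical custom big object
# (Case_Archive__b) for an audit request. Big object SOQL must filter on the
# object's index fields; all names, the URL, and the token are placeholders.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"   # placeholder
ACCESS_TOKEN = "00D...session_token"                       # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

soql = ("SELECT AccountId__c, CaseNumber__c, ClosedDate__c, Status__c "
        "FROM Case_Archive__b "
        "WHERE AccountId__c = '001xx000003DGbX'")          # filter on an index field

resp = requests.get(f"{INSTANCE_URL}/services/data/v58.0/query",
                    headers=HEADERS, params={"q": soql})

for rec in resp.json().get("records", []):
    print(rec["CaseNumber__c"], rec["ClosedDate__c"], rec["Status__c"])
```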

NTO needs to extract 50 million records from a custom object every day from its Salesforce org. NTO is facing query timeout issues while extracting these records.

What should a data architect recommend in order to get around the time out issue?

A. Use a custom auto number and formula field and use that to chunk records while extracting data.
B. Use the REST API to extract data, as it automatically chunks records by 200.
C. Use an ETL tool for extraction of records.
D. Ask Salesforce support to increase the query timeout value.
Suggested answer: C

Explanation:

The best solution for extracting 50 million records from a custom object every day without query timeouts is to use an ETL tool. ETL stands for extract, transform, and load, and refers to the process of moving data from one system to another. An ETL tool can connect to various data sources, perform transformations, and load data into a target destination; such tools handle large volumes of data efficiently and reliably, typically provide scheduling, monitoring, error handling, and logging, and generally rely on the Bulk API for large Salesforce extracts. Using a custom auto number and formula field to chunk records while extracting data is a possible workaround, but it requires creating additional fields and writing complex queries. The REST API automatically paginates query results in small batches, but it is not designed for extracting tens of millions of records per day. Asking Salesforce support to increase the query timeout value is not feasible because query timeout values are not configurable.
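Most ETL tools handle extracts of this size by delegating to the Bulk API rather than issuing one long-running SOQL query. The sketch below shows that underlying Bulk API 2.0 query pattern for illustration; the object name (Order_Line__c), instance URL, access token, and API version are placeholders.

```python
# Hedged sketch: the Bulk API 2.0 query pattern that ETL tools typically use
# for very large extracts. Object name, URL, token, and API version are
# placeholders; results come back as CSV pages addressed by a locator.
import time
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"   # placeholder
ACCESS_TOKEN = "00D...session_token"                       # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}", "Content-Type": "application/json"}
BASE = f"{INSTANCE_URL}/services/data/v58.0/jobs/query"

# 1. Create an asynchronous query job instead of one long-running SOQL call.
job = requests.post(BASE, headers=HEADERS, json={
    "operation": "query",
    "query": "SELECT Id, Name, CreatedDate FROM Order_Line__c",  # hypothetical object
}).json()

# 2. Poll until Salesforce has finished preparing the result set.
while True:
    state = requests.get(f"{BASE}/{job['id']}", headers=HEADERS).json()["state"]
    if state in ("JobComplete", "Failed", "Aborted"):
        break
    time.sleep(30)

# 3. Page through the CSV results using the Sforce-Locator response header.
locator = None
while state == "JobComplete" and locator != "null":
    params = {"maxRecords": 100000}
    if locator:
        params["locator"] = locator
    page = requests.get(f"{BASE}/{job['id']}/results",
                        headers={"Authorization": HEADERS["Authorization"],
                                 "Accept": "text/csv"},
                        params=params)
    locator = page.headers.get("Sforce-Locator")
    # page.text holds one CSV chunk; stream it to the external target here.
```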
