Salesforce Certified Data Architect Practice Test - Questions Answers, Page 13
List of questions
Question 121

NTO (Northern Trail Outfitters) has a complex Salesforce org that has been developed over the past 5 years. Internal users are complaining about multiple data issues, including incomplete and duplicate data in the org. NTO has decided to engage a data architect to analyze and define data quality standards.
Which 3 key factors should a data architect consider while defining data quality standards? Choose 3 answers:
Explanation:
Defining data duplication standards and rules, measuring data timeliness and consistency, and measuring data completeness and accuracy are three key factors that a data architect should consider while defining data quality standards. Defining data duplication standards and rules can help prevent or reduce duplicate records in the org by specifying criteria and actions for identifying and merging duplicates. Measuring data timeliness and consistency can help ensure that the data is up-to-date, reliable, and synchronized across different sources. Measuring data completeness and accuracy can help ensure that the data is sufficient, relevant, and correct for the intended purposes.
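As a rough illustration of those three measures, the sketch below profiles completeness, duplication, and timeliness on an exported data set. This is a minimal sketch, not part of the question: the CSV path and the choice of "required" fields are assumptions, and a real duplicate rule would use more sophisticated matching.

```python
# Minimal sketch: profiling completeness, duplication, and timeliness of an
# exported Contact data set. Email, Phone, MailingCountry, and LastModifiedDate
# are standard Contact fields; the CSV path is hypothetical.
import pandas as pd

df = pd.read_csv("contacts_export.csv")
df["LastModifiedDate"] = pd.to_datetime(df["LastModifiedDate"], utc=True)

# Completeness: share of records with the required fields populated
required = ["Email", "Phone", "MailingCountry"]
completeness = df[required].notna().mean()

# Duplication: records sharing the same normalized email (a simple matching rule)
dupes = (
    df.assign(email_key=df["Email"].str.lower().str.strip())
      .duplicated(subset="email_key", keep=False)
      .sum()
)

# Timeliness: share of records not touched in the last 12 months
cutoff = pd.Timestamp.now(tz="UTC") - pd.DateOffset(months=12)
stale = (df["LastModifiedDate"] < cutoff).mean()

print(completeness, dupes, stale)
```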
Question 122

Universal Containers (UC) requires 2 years of customer-related cases to be available in SF for operational reporting. Any cases older than 2 years and up to 7 years need to be available on demand to the service agents. UC creates 5 million cases per year.
Which 2 data archiving strategies should a data architect recommend? Choose 2 options:
Explanation:
The best data archiving strategies for UC are to use Big objects and Heroku with external objects. Big objects allow storing large amounts of data on the Salesforce platform without affecting performance or storage limits. They also support point-and-click tools, triggers, and Apex code. Heroku is a cloud platform that can host external databases and integrate with Salesforce using external objects. External objects enable on-demand access to external data sources via standard Salesforce APIs and user interfaces. Using the Bulk API to hard delete archived cases from Salesforce will free up storage space and improve performance.
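For the hard-delete step, a hedged sketch of a Bulk API 2.0 ingest job with the hardDelete operation is shown below. The instance URL, access token, and sample Ids are placeholders, the API version (v58.0) is an assumption, and the running user would need the "Bulk API Hard Delete" permission.

```python
# Minimal sketch: hard-deleting already-archived Case records with Bulk API 2.0.
# INSTANCE_URL, ACCESS_TOKEN, and the sample Ids are placeholders.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"   # placeholder
ACCESS_TOKEN = "<ACCESS_TOKEN>"                           # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}", "Content-Type": "application/json"}

# 1. Create a hardDelete ingest job for the Case object
job = requests.post(
    f"{INSTANCE_URL}/services/data/v58.0/jobs/ingest",
    headers=HEADERS,
    json={"object": "Case", "operation": "hardDelete", "contentType": "CSV"},
).json()

# 2. Upload a CSV of the Case Ids that were already copied to the archive
csv_body = "Id\n5003000000D8cuIAAA\n5003000000D8cuJAAA\n"   # sample Ids
requests.put(
    f"{INSTANCE_URL}/services/data/v58.0/jobs/ingest/{job['id']}/batches",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}", "Content-Type": "text/csv"},
    data=csv_body,
)

# 3. Mark the upload complete so Salesforce starts processing the delete
requests.patch(
    f"{INSTANCE_URL}/services/data/v58.0/jobs/ingest/{job['id']}",
    headers=HEADERS,
    json={"state": "UploadComplete"},
)
```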
Question 123

NTO would like to retrieve their SF org's metadata programmatically for backup in various external targets. Which API is the best fit for accomplishing this task?
Explanation:
The best API for retrieving Salesforce org metadata programmatically is the Metadata API. The Metadata API provides access to the metadata that defines the structure and configuration of an org, such as custom objects, fields, workflows, security settings, etc. It also supports deploying, retrieving, creating, updating, and deleting metadata components. The Metadata API can be used with various tools, such as Ant, Workbench, or IDEs.
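A hedged sketch of the starting point for such a backup is shown below: building the package.xml manifest that a Metadata API retrieve() call consumes. The list of metadata types and the API version are assumptions; the actual retrieve would be submitted with whichever tool the team uses (Ant Migration Tool, Workbench, the Salesforce CLI, or a direct SOAP call), which is omitted here.

```python
# Minimal sketch: generating a package.xml manifest for a Metadata API retrieve.
# The metadata types and API version are illustrative choices.
from pathlib import Path

metadata_types = ["CustomObject", "Workflow", "Profile", "Flow"]  # types to back up

types_xml = "\n".join(
    f"    <types>\n        <members>*</members>\n        <name>{t}</name>\n    </types>"
    for t in metadata_types
)

package_xml = f"""<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
{types_xml}
    <version>58.0</version>
</Package>
"""

# The manifest is then passed to the retrieve() call of the chosen tool
Path("package.xml").write_text(package_xml)
```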
Question 124

A customer is operating in a highly regulated industry and is planning to implement SF. The customer information maintained in SF includes the following:
Personally identifiable information (PII)
IP restrictions on profiles organized by geographic location
Financial records that need to be private and accessible only by the assigned sales associate
Users should not be allowed to export information from Salesforce
Enterprise security has mandated that access be restricted to users within a specific geography and that user activity be monitored in detail. Which 3 Salesforce Shield capabilities should a data architect recommend? Choose 3 answers:
Explanation:
The best Salesforce Shield capabilities for the customer are to restrict access to SF for users outside a specific geography, implement Transaction Security policies to prevent export of SF data, and encrypt sensitive customer information maintained in SF. Salesforce Shield is a set of security features that help protect enterprise data on the Salesforce platform. It includes three components: Event Monitoring, Platform Encryption, and Field Audit Trail. Restricting access to SF for users outside a specific geography can be done using network-based security features, such as IP whitelisting or VPN. Transaction Security policies can be used to define actions or notifications based on user behavior patterns, such as exporting data or logging in from an unknown device. Platform Encryption can be used to encrypt data at rest using a tenant secret key that is controlled by the customer.
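For the detailed monitoring requirement, the sketch below shows one way to pull report-export activity from Event Monitoring log files through the REST API. This is a minimal sketch under assumptions: the instance URL and token are placeholders, the API version is illustrative, and the org must have Event Monitoring enabled.

```python
# Minimal sketch: reading report-export activity from Event Monitoring (Shield)
# via the EventLogFile object. INSTANCE_URL and the bearer token are placeholders.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"   # placeholder
HEADERS = {"Authorization": "Bearer <ACCESS_TOKEN>"}       # placeholder token

soql = ("SELECT Id, EventType, LogDate FROM EventLogFile "
        "WHERE EventType = 'ReportExport' ORDER BY LogDate DESC")
rows = requests.get(
    f"{INSTANCE_URL}/services/data/v58.0/query",
    headers=HEADERS,
    params={"q": soql},
).json()["records"]

# Each log file is a CSV blob of user activity for that day
for row in rows:
    log_csv = requests.get(
        f"{INSTANCE_URL}/services/data/v58.0/sobjects/EventLogFile/{row['Id']}/LogFile",
        headers=HEADERS,
    ).text
    # ...feed log_csv into the monitoring or alerting pipeline of choice
```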
Question 125

NTO processes orders from its website via an order management system (OMS). The OMS stores over 2 million historical records and is currently not integrated with SF. The Sales team at NTO uses Sales Cloud and would like visibility into related customer orders, yet they do not want to persist millions of records directly in Salesforce. NTO has asked the data architect to evaluate SF Connect and the concept of data virtualization. Which 3 considerations are needed prior to a SF Connect implementation?
Choose 3 answers:
Explanation:
The three considerations needed prior to a SF Connect implementation are to develop an object relationship strategy, identify the external tables to sync into external objects, and assess whether the external data source is reachable via an OData endpoint. SF Connect is a feature that allows integrating external data sources with Salesforce using external objects. External objects are similar to custom objects, but they store metadata only and not data. They enable on-demand access to external data via standard Salesforce APIs and user interfaces. To implement SF Connect, a data architect needs to consider how the external objects will relate to other objects in Salesforce, which external tables will be exposed as external objects, and whether the external data source supports the OData protocol for data access.
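A quick way to test the last consideration is to request the OData service's $metadata document, which also lists the entity sets that could become external objects. The sketch below assumes a hypothetical OMS endpoint URL.

```python
# Minimal sketch: checking that the OMS exposes a reachable OData endpoint
# before planning the Salesforce Connect external data source.
import requests

ODATA_ROOT = "https://oms.example.com/odata/v4"   # hypothetical OMS endpoint

# An OData service publishes its entity model at $metadata; if this resolves,
# the entity sets listed there are candidates for external objects.
resp = requests.get(f"{ODATA_ROOT}/$metadata", timeout=10)
resp.raise_for_status()

print(resp.headers.get("Content-Type"))   # typically an XML media type
print(resp.text[:500])                    # inspect entity sets (e.g., Orders, OrderLines)
```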
Question 126

UC has been using SF for 10 years. Lately, users have noticed that pages load slowly when viewing Customer and Account list views.
To mitigate this, UC will implement a data archiving strategy to reduce the amount of data actively loaded.
Which 2 tasks are required to define the strategy? Choose 2 answers:
Explanation:
The two tasks required to define the data archive strategy are to identify how the archive data will be accessed and used, and identify the data retention requirements. Data archiving is the process of moving infrequently used or historical data from active storage to a separate storage location for long-term retention. Data archiving can improve performance, reduce storage costs, and comply with legal or regulatory obligations. To define a data archive strategy, a data architect needs to consider how the archived data will be accessed and used by different users or processes in Salesforce or outside Salesforce, and how long the archived data needs to be retained based on business or legal requirements.
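Once the retention requirements are documented, they can be translated into an archival selection window. The sketch below is purely illustrative: the retention periods and object names are assumptions, not UC's actual policy.

```python
# Minimal sketch: turning documented retention requirements into an archival
# cutoff date and the filter used by the archive job. Values are illustrative.
from datetime import date, timedelta

retention = {
    "Account": {"active_years": 5, "archive_years": 10},
    "Contact": {"active_years": 5, "archive_years": 10},
}

def archive_cutoff(obj: str) -> date:
    """Records last modified before this date move out of active storage."""
    return date.today() - timedelta(days=365 * retention[obj]["active_years"])

# The cutoff feeds the extraction query used by the archive job
cutoff = archive_cutoff("Account").isoformat()
print(f"SELECT Id FROM Account WHERE LastModifiedDate < {cutoff}T00:00:00Z")
```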
Question 127

UC uses classic encryption for custom fields and leverages the weekly data export for data backups. During validation of the exported data, UC discovered that encrypted field values are still being exported as part of the data export. What should a data architect recommend to make sure decrypted values are exported during the data export?
Explanation:
The best solution to make sure decrypted values are exported during data export is to create another field to copy data from the encrypted field and use this field in the export. This is because classic encryption does not support exporting decrypted values of encrypted fields. The View Encrypted Data permission only allows users to view decrypted values in the user interface, not in reports or data exports. Therefore, a workaround is to use automation (for example, an Apex trigger) that copies the value of the encrypted field to an unencrypted field, and to use that field for data export. However, this solution has drawbacks, such as exposing sensitive data in plain text and consuming extra storage space. A better solution would be to use Shield Platform Encryption, which supports exporting decrypted values of encrypted fields with the Export Encrypted Data permission.
Question 128

Universal Containers is implementing Salesforce lead management. UC procures lead data from multiple sources and would like to make sure lead data has company profile and location information. Which solution should a data architect recommend to make sure lead data has both profile and location information?
Explanation:
The best solution to make sure lead data has both profile and location information is to leverage external data providers to populate company profile and location data. External data providers can enrich lead data with additional information from third-party sources, such as Dun & Bradstreet, ZoomInfo, or Clearbit, which can improve lead quality, segmentation, and conversion. Salesforce supports integrating with external data providers using Data.com Clean or other AppExchange solutions. Asking salespeople to research and populate company profile and location data is inefficient and prone to errors. Running reports to identify records that do not have company profile and location data is useful, but it does not solve the problem of how to populate the missing data. Exporting data out of Salesforce and sending it to another team to populate company profile and location data is cumbersome and time-consuming.
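The sketch below illustrates the enrichment pattern: call a provider, map the response onto standard Lead fields, and write it back through the Salesforce REST API. The enrichment endpoint, its response shape, the instance URL, and the token are all hypothetical; real providers each have their own APIs and many ship AppExchange packages that avoid custom code entirely.

```python
# Minimal sketch: enriching a Lead with company profile and location data from a
# hypothetical third-party provider, then updating the Lead via the REST API.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"   # placeholder
SF_HEADERS = {"Authorization": "Bearer <ACCESS_TOKEN>",    # placeholder token
              "Content-Type": "application/json"}

def enrich_lead(lead_id: str, company_domain: str) -> None:
    # Hypothetical enrichment call; substitute the chosen provider's real API
    profile = requests.get(
        "https://enrichment.example.com/v1/companies",
        params={"domain": company_domain},
    ).json()

    # Map provider fields onto standard Lead fields
    update = {
        "Company": profile.get("name"),
        "Industry": profile.get("industry"),
        "NumberOfEmployees": profile.get("employee_count"),
        "City": profile.get("city"),
        "Country": profile.get("country"),
    }
    requests.patch(
        f"{INSTANCE_URL}/services/data/v58.0/sobjects/Lead/{lead_id}",
        headers=SF_HEADERS,
        json=update,
    )
```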
Question 129

UC has millions of case records with case history and SLA data. UC's compliance team would like historical cases to be accessible for 10 years for audit purposes.
What solution should a data architect recommend?
Explanation:
The best solution to store historical cases for 10 years for audit purposes is to use a custom Big object to store the archived case data. Big objects are a type of custom object that can store massive amounts of data on the Salesforce platform without affecting performance or storage limits. They also support point-and-click tools, triggers, and Apex code. Big objects can be used for archiving historical data that needs to be retained for compliance or analytics purposes. Archiving case data using the Salesforce archiving process is not a good option because it only supports archiving cases that have been closed for more than one year, and it does not allow customizing the archival criteria or accessing the archived data via Apex or APIs. Purchasing more data storage to support the Case object is expensive and may impact performance. Using a regular custom object to store archived case data is not scalable and may consume a lot of storage space.
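For reference, archived rows in a Big object can be read back with SOQL through the REST API, as long as the filter follows the object's index. The sketch below assumes a hypothetical Case_Archive__b big object indexed on Account__c then Closed_Date__c; the object, field names, instance URL, and token are all placeholders.

```python
# Minimal sketch: querying a hypothetical Case_Archive__b big object.
# Big object SOQL must filter on the index fields in index order:
# equality on earlier fields, with an optional range on the last filtered field.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"   # placeholder
HEADERS = {"Authorization": "Bearer <ACCESS_TOKEN>"}       # placeholder token

soql = ("SELECT Account__c, Closed_Date__c, Subject__c, SLA_Status__c "
        "FROM Case_Archive__b "
        "WHERE Account__c = '001xx000003DGbEAAW' "         # sample Account Id
        "AND Closed_Date__c > 2016-01-01T00:00:00Z")

resp = requests.get(
    f"{INSTANCE_URL}/services/data/v58.0/query",
    headers=HEADERS,
    params={"q": soql},
)
for rec in resp.json()["records"]:
    print(rec["Subject__c"], rec["Closed_Date__c"])
```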
Question 130

NTO needs to extract 50 million records from a custom object every day from its Salesforce org. NTO is facing query timeout issues while extracting these records.
What should a data architect recommend in order to get around the time out issue?
Explanation:
The best solution to extract 50 million records from a custom object every day without facing query timeout issues is to use an ETL tool. ETL stands for extract, transform, and load, and it refers to the process of moving data from one system to another. An ETL tool is a software application that can connect to various data sources, perform data transformations, and load data into a target destination. ETL tools can handle large volumes of data efficiently and reliably, and they often provide features such as scheduling, monitoring, error handling, and logging. Using a custom auto-number and formula field to chunk records while extracting data is a possible workaround, but it requires creating additional fields and writing complex queries. The REST API can extract data as it automatically chunks records by 200, but it has limitations, such as a maximum of 50 million records per query job. Asking Salesforce support to increase the query timeout value is not feasible because query timeout values are not configurable.
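Most ETL tools perform this extraction through the Bulk API under the hood. The sketch below shows the same pattern directly with a Bulk API 2.0 query job, which runs asynchronously and pages results instead of timing out a synchronous SOQL call. The instance URL, token, API version, and the custom object name (Order_Item__c) are placeholders.

```python
# Minimal sketch: extracting a large custom object with a Bulk API 2.0 query job.
import time
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"   # placeholder
HEADERS = {"Authorization": "Bearer <ACCESS_TOKEN>",       # placeholder token
           "Content-Type": "application/json"}

# 1. Create the query job; Salesforce processes it asynchronously
job = requests.post(
    f"{INSTANCE_URL}/services/data/v58.0/jobs/query",
    headers=HEADERS,
    json={"operation": "query", "query": "SELECT Id, Name FROM Order_Item__c"},
).json()

# 2. Poll until the job finishes
while True:
    state = requests.get(
        f"{INSTANCE_URL}/services/data/v58.0/jobs/query/{job['id']}", headers=HEADERS
    ).json()["state"]
    if state in ("JobComplete", "Failed", "Aborted"):
        break
    time.sleep(30)

# 3. Download the results in pages using the Sforce-Locator header
if state == "JobComplete":
    locator = None
    page = 0
    while True:
        params = {"maxRecords": 50000}
        if locator:
            params["locator"] = locator
        resp = requests.get(
            f"{INSTANCE_URL}/services/data/v58.0/jobs/query/{job['id']}/results",
            headers={"Authorization": HEADERS["Authorization"], "Accept": "text/csv"},
            params=params,
        )
        with open(f"order_items_{page}.csv", "w") as fh:
            fh.write(resp.text)
        page += 1
        locator = resp.headers.get("Sforce-Locator")
        if not locator or locator == "null":   # "null" means no more pages
            break
```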