
Salesforce Certified Data Architect Practice Test - Questions Answers, Page 13


Question 121


NTO (Northern Trail Outfitters) has a complex Salesforce org that has been developed over the past 5 years. Internal users are complaining about multiple data issues, including incomplete and duplicate data in the org. NTO has decided to engage a data architect to analyze and define data quality standards.

Which 3 key factors should a data architect consider while defining data quality standards? Choose 3 answers:

A. Define data duplication standards and rules
B. Define key fields in a staging database for data cleansing
C. Measure data timeliness and consistency
D. Finalize an extract, transform, load (ETL) tool for data migration
E. Measure data completeness and accuracy
Suggested answer: A, C, E

Explanation:

Defining data duplication standards and rules, measuring data timeliness and consistency, and measuring data completeness and accuracy are three key factors that a data architect should consider while defining data quality standards. Defining data duplication standards and rules can help prevent or reduce duplicate records in the org by specifying criteria and actions for identifying and merging duplicates. Measuring data timeliness and consistency can help ensure that the data is up-to-date, reliable, and synchronized across different sources. Measuring data completeness and accuracy can help ensure that the data is sufficient, relevant, and correct for the intended purposes.
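
As a rough illustration of the "measure" factors, completeness and duplication can be profiled directly on an exported data set before the standards are formalized. The sketch below uses Python with pandas on a hypothetical Lead CSV export; the file name and the required field list are assumptions, not part of the scenario.

```python
# Minimal data-quality profiling sketch (assumed Lead export with Email, Company, Phone columns).
import pandas as pd

REQUIRED_FIELDS = ["Email", "Company", "Phone"]  # assumed key fields for the quality standard


def profile_leads(csv_path: str) -> None:
    df = pd.read_csv(csv_path)

    # Completeness: percentage of non-empty values per required field.
    completeness = df[REQUIRED_FIELDS].notna().mean() * 100
    print("Completeness (% populated):")
    print(completeness.round(1).to_string())

    # Duplication: records sharing the same Email are flagged as potential duplicates.
    dupes = df[df.duplicated(subset=["Email"], keep=False) & df["Email"].notna()]
    print(f"Potential duplicates by Email: {len(dupes)} of {len(df)} records")


if __name__ == "__main__":
    profile_leads("leads_export.csv")  # hypothetical export file
```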


Question 122


Universal Containers (UC) requires 2 years of customer-related cases to be available in Salesforce for operational reporting. Any cases older than 2 years and up to 7 years need to be available on demand to Service agents. UC creates 5 million cases per year.

Which 2 data archiving strategies should a data architect recommend? Choose 2 options:

A. Use custom objects for cases older than 2 years and use a nightly batch to move them.
B. Sync cases older than 2 years to an external database, and provide Service agents access to the database.
C. Use Big objects for cases older than 2 years, and use a nightly batch to move them.
D. Use Heroku and external objects to display cases older than 2 years, and the Bulk API to hard delete them from Salesforce.
Suggested answer: C, D

Explanation:

The best data archiving strategies for UC are to use Big objects and Heroku with external objects. Big objects allow storing large amounts of data on the Salesforce platform without affecting performance or counting against standard data storage limits, and they can be populated and queried with Apex, SOQL on their index fields, and the Bulk API. Heroku is a cloud platform that can host external databases and integrate with Salesforce using external objects. External objects enable on-demand access to external data sources via standard Salesforce APIs and user interfaces. Using the Bulk API to hard delete cases from Salesforce frees up storage space and improves performance.
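
To make the recommendation more concrete, a nightly archive batch could be sketched as follows. This assumes the simple_salesforce Python client, a custom Big object named Case_Archive__b with matching fields, and illustrative credentials; none of these are mandated by the scenario.

```python
# Nightly archive sketch: move closed cases older than 2 years into an assumed Big object.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="pwd", security_token="token")

# 1. Pull closed cases older than 2 years via the Bulk API (handles large volumes).
old_cases = sf.bulk.Case.query(
    "SELECT Id, CaseNumber, AccountId, Subject, Status, ClosedDate "
    "FROM Case WHERE IsClosed = true AND ClosedDate < LAST_N_YEARS:2"
)

# 2. Map them onto the Big object's fields (field names are illustrative).
archive_rows = [
    {
        "CaseNumber__c": c["CaseNumber"],
        "Account__c": c["AccountId"],
        "Subject__c": c["Subject"],
        "Status__c": c["Status"],
        "ClosedDate__c": c["ClosedDate"],
    }
    for c in old_cases
]

if archive_rows:
    sf.bulk.Case_Archive__b.insert(archive_rows)
    # 3. Hard delete the archived cases from the Case object to free storage.
    sf.bulk.Case.delete([{"Id": c["Id"]} for c in old_cases])
```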


Question 123


NTO would like to retrieve its Salesforce org's metadata programmatically for backup in various external systems. Which API is the best fit for accomplishing this task?

A. Metadata API
B. Tooling API
C. Bulk API in serial mode
D. SOAP API
Suggested answer: A

Explanation:

The best API for retrieving Salesforce org metadata programmatically is the Metadata API. The Metadata API provides access to the metadata that defines the structure and configuration of an org, such as custom objects, fields, workflows, security settings, etc. It also supports deploying, retrieving, creating, updating, and deleting metadata components. The Metadata API can be used with various tools, such as Ant, Workbench, or IDEs.


Question 124


A customer operating in a highly regulated industry is planning to implement Salesforce. The customer information maintained in Salesforce includes the following:

Personally identifiable information (PII)

IP restrictions on profiles organized by geographic location

Financial records that need to be private and accessible only by the assigned sales associate

Users should not be allowed to export information from Salesforce

Enterprise security has mandated that access be restricted to users within a specific geography and that user activity be monitored in detail. Which 3 Salesforce Shield capabilities should a data architect recommend? Choose 3 answers:

A. Event Monitoring to monitor all user activities
B. Restrict access to Salesforce from users outside a specific geography
C. Prevent Sales users' access to customer PII information
D. Transaction Security policies to prevent export of Salesforce data
E. Encrypt sensitive customer information maintained in Salesforce
Suggested answer: B, D, E

Explanation:

The best Salesforce Shield capabilities for the customer are to restrict access to Salesforce for users outside a specific geography, implement Transaction Security policies to prevent export of Salesforce data, and encrypt sensitive customer information maintained in Salesforce. Salesforce Shield is a set of security features that help protect enterprise data on the Salesforce platform; it includes Event Monitoring, Platform Encryption, and Field Audit Trail. Restricting access for users outside a specific geography can be enforced with network-based controls such as IP restrictions on profiles. Transaction Security policies can define actions or notifications based on user behavior patterns, such as exporting data or logging in from an unknown device. Platform Encryption can encrypt data at rest using a tenant secret key that is controlled by the customer.


Question 125


NTO processes orders from its website via an order management system (OMS). The OMS stores over 2 million historical records and is currently not integrated with Salesforce. The Sales team at NTO is using Sales Cloud and would like visibility into related customer orders, yet it does not want to persist millions of records directly in Salesforce. NTO has asked the data architect to evaluate Salesforce Connect and the concept of data virtualization. Which 3 considerations are needed prior to a Salesforce Connect implementation?

Choose 3 answers:

A. Create a second System Administrator user for authentication to the external source.
B. Develop an object relationship strategy.
C. Identify the external tables to sync into external objects.
D. Assess whether the external data source is reachable via an OData endpoint.
E. Configure a middleware tool to poll external table data.
Suggested answer: B, C, D

Explanation:

The three considerations needed prior to a Salesforce Connect implementation are to develop an object relationship strategy, identify the external tables to sync into external objects, and assess whether the external data source is reachable via an OData endpoint. Salesforce Connect is a feature that allows integrating external data sources with Salesforce using external objects. External objects are similar to custom objects, but they store only metadata, not data; they enable on-demand access to external data via standard Salesforce APIs and user interfaces. To implement Salesforce Connect, a data architect needs to consider how the external objects will relate to other objects in Salesforce, which external tables will be exposed as external objects, and whether the external data source supports the OData protocol for data access.
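
The OData reachability check in particular can be verified up front with a simple HTTP request to the service's $metadata document. The endpoint URL below is a placeholder for the OMS, not a real address.

```python
# Check that the external OMS exposes a reachable OData service before configuring Salesforce Connect.
import requests

ODATA_ROOT = "https://oms.example.com/odata/v4"  # hypothetical OMS endpoint

resp = requests.get(f"{ODATA_ROOT}/$metadata", timeout=10)
resp.raise_for_status()

# A valid OData service returns an EDMX metadata document describing the entity
# sets (tables) that could be mapped to external objects.
print(resp.headers.get("Content-Type"))
print(resp.text[:500])
```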


Question 126


UC has been using Salesforce for 10 years. Lately, users have noticed that pages load slowly when viewing Customer and Account list views.

To mitigate this, UC will implement a data archive strategy to reduce the amount of data actively loaded.

Which 2 tasks are required to define the strategy? Choose 2 answers:

A. Identify the recovery point objective.
B. Identify how the archived data will be accessed and used.
C. Identify the recovery time objective.
D. Identify the data retention requirements.
Suggested answer: B, D

Explanation:

The two tasks required to define the data archive strategy are to identify how the archive data will be accessed and used, and identify the data retention requirements. Data archiving is the process of moving infrequently used or historical data from active storage to a separate storage location for long-term retention. Data archiving can improve performance, reduce storage costs, and comply with legal or regulatory obligations. To define a data archive strategy, a data architect needs to consider how the archived data will be accessed and used by different users or processes in Salesforce or outside Salesforce, and how long the archived data needs to be retained based on business or legal requirements.


Question 127


UC has classic encryption enabled for custom fields and leverages weekly data exports for backups. During validation of the exported data, UC discovered that encrypted field values are still exported in their encrypted form. What should a data architect recommend to make sure decrypted values are exported during the data export?

A. Set a standard profile for the data migration user, and assign the View Encrypted Data permission.
B. Create another field to copy data from the encrypted field, and use this field in the export.
C. Leverage an Apex class to decrypt data before exporting it.
D. Set up a custom profile for the data migration user, and assign the View Encrypted Data permission.
Suggested answer: B

Explanation:

The best solution to make sure decrypted values are exported during data export is to create another field that copies data from the encrypted field and use that field in the export. Classic encryption does not decrypt encrypted field values in data exports; the View Encrypted Data permission only allows users to view decrypted values in the user interface, not in reports or data exports. A workaround is therefore to use a workflow field update (or similar automation) that copies the value of the encrypted field to another field and to export that field instead. This has drawbacks, such as exposing sensitive data in plain text and consuming extra storage space. A better long-term solution is Shield Platform Encryption, which returns decrypted values to any user with field-level access, so exports contain readable data.


Question 128


Universal Containers (UC) is implementing Salesforce lead management. UC procures lead data from multiple sources and would like to make sure lead data has company profile and location information. Which solution should a data architect recommend to ensure lead data has both profile and location information?

A. Ask salespeople to search for and populate company profile and location data.
B. Run reports to identify records that do not have company profile and location data.
C. Leverage external data providers to populate company profile and location data.
D. Export data out of Salesforce and send it to another team to populate company profile and location data.
Suggested answer: C

Explanation:

The best solution to make sure lead data has both company profile and location information is to leverage external data providers to populate it. External data providers can enrich lead data with additional information from third-party sources such as Dun & Bradstreet, ZoomInfo, or Clearbit, which helps improve lead quality, segmentation, and conversion. Salesforce supports integrating with external data providers through AppExchange data enrichment solutions (historically Data.com Clean). Asking salespeople to search for and populate company profile and location data is inefficient and error-prone. Running reports to identify records that lack company profile and location data is useful, but does not solve the problem of how to populate the missing data. Exporting data out of Salesforce and sending it to another team to populate is cumbersome and time-consuming.
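
As a sketch of the recommended approach, an integration could read incomplete leads and enrich them from a provider. The provider URL, response fields, and use of simple_salesforce below are all illustrative assumptions rather than any specific product's API.

```python
# Illustrative enrichment flow: fill missing company/location data on Leads from a hypothetical provider.
import requests
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="pwd", security_token="token")

leads = sf.query(
    "SELECT Id, Email FROM Lead WHERE City = null AND Email != null LIMIT 200"
)["records"]

for lead in leads:
    resp = requests.get(
        "https://enrichment.example.com/v1/company",  # hypothetical provider endpoint
        params={"email": lead["Email"]},
        timeout=10,
    )
    if resp.ok:
        data = resp.json()
        sf.Lead.update(lead["Id"], {
            "Industry": data.get("industry"),
            "City": data.get("city"),
            "Country": data.get("country"),
        })
```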


Question 129


UC has millions of case records with case history and SLA data. UC's compliance team would like historical cases to be accessible for 10 years for audit purposes.

What solution should a data architect recommend?

A. Archive case data using the Salesforce archiving process.
B. Purchase more data storage to support the Case object.
C. Use a custom object to store archived case data.
D. Use a custom Big object to store archived case data.
Suggested answer: D

Explanation:

The best solution to store historical cases for 10 years for audit purposes is to use a custom Big object to store archived case data. Big objects can store massive amounts of data on the Salesforce platform without counting against regular data storage limits or degrading performance, and they can be populated and queried with Apex, SOQL on their index fields, and the Bulk API, which makes them well suited to archiving historical data that must be retained for compliance or analytics. Archiving case data using the Salesforce archiving process is not a good option because it does not allow customizing the archival criteria or accessing the archived data flexibly via Apex or APIs. Purchasing more data storage to support the Case object is expensive and may impact performance. Using a custom object to store archived case data is not scalable and consumes a large amount of storage space.
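
On the retrieval side, archived rows in a Big object are read with SOQL filters on the object's index fields. The sketch below assumes a Case_Archive__b Big object whose index is (Account__c, ClosedDate__c); the object, fields, and record ID are illustrative.

```python
# On-demand lookup of archived cases from an assumed Case_Archive__b Big object.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="pwd", security_token="token")

result = sf.query(
    "SELECT CaseNumber__c, Subject__c, Status__c, ClosedDate__c "
    "FROM Case_Archive__b "
    "WHERE Account__c = '001xx000003DGbTAAW' "
    "AND ClosedDate__c >= 2016-01-01T00:00:00Z"
)
for row in result["records"]:
    print(row["CaseNumber__c"], row["Status__c"])
```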


Question 130


NTO needs to extract 50 million records from a custom object every day from its Salesforce org. NTO is facing query timeout issues while extracting these records.

What should a data architect recommend to get around the timeout issue?

A. Use a custom auto number and formula field, and use that to chunk records while extracting data.
B. Use the REST API to extract data, as it automatically chunks records by 200.
C. Use an ETL tool for extraction of records.
D. Ask Salesforce support to increase the query timeout value.
Suggested answer: C

Explanation:

The best solution for extracting 50 million records from a custom object every day without query timeouts is to use an ETL tool. ETL stands for extract, transform, and load, and refers to moving data from one system to another. An ETL tool can connect to various data sources, perform transformations, and load data into a target destination; such tools handle large volumes efficiently and reliably and typically provide scheduling, monitoring, error handling, logging, and primary-key chunking. Using a custom auto number and formula field to chunk records while extracting data is a possible workaround, but it requires creating additional fields and writing more complex queries. The REST API paginates query results, but it is not designed to pull 50 million records per day efficiently and remains subject to the same query timeouts. Asking Salesforce support to increase the query timeout value is not feasible because query timeout values are not configurable.
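
For context, the chunking idea that a good ETL tool (or the option A workaround) relies on looks roughly like this: bound each query by a numeric range so no single query scans the full 50 million rows. The object name, the Chunk_Number__c formula field, and the chunk size below are assumptions for illustration.

```python
# Range-chunked extraction sketch to avoid query timeouts on a very large custom object.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="pwd", security_token="token")

CHUNK_SIZE = 250_000
TOTAL_RECORDS = 50_000_000  # per the scenario

for start in range(0, TOTAL_RECORDS, CHUNK_SIZE):
    soql = (
        "SELECT Id, Name FROM Custom_Object__c "
        f"WHERE Chunk_Number__c >= {start} AND Chunk_Number__c < {start + CHUNK_SIZE}"
    )
    batch = sf.bulk.Custom_Object__c.query(soql)  # each bounded query stays selective
    print(f"Rows {start}-{start + CHUNK_SIZE}: extracted {len(batch)} records")
```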
