Salesforce Certified Data Architect Practice Test - Questions Answers, Page 4
Question 31
How can an architect find information about who is creating, changing, or deleting certain fields within the past two months?
Explanation:
The setup audit trail tracks changes made in your org's setup area for the past 180 days. You can export the setup audit trail and find the fields in question by filtering by action type, user, or date.
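As a sketch of that filtering step, an exported audit trail can be narrowed down outside Salesforce. The column names (`Date`, `User`, `Action`) and action phrasings below are illustrative assumptions; check them against your org's actual export file:

```python
# Sketch: filter an exported setup audit trail for recent field-level changes.
# Column names and action text are assumptions, not guaranteed export headers.
from datetime import datetime, timedelta

def recent_field_changes(rows, days=60, now=None):
    """Return audit rows about field create/change/delete within `days` days."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=days)
    keywords = ("created custom field", "changed custom field", "deleted custom field")
    return [
        r for r in rows
        if datetime.strptime(r["Date"], "%Y-%m-%d") >= cutoff
        and any(k in r["Action"].lower() for k in keywords)
    ]

# Hypothetical export rows for illustration
rows = [
    {"Date": "2024-05-01", "User": "admin@uc.com", "Action": "Created custom field Region on Account"},
    {"Date": "2023-11-01", "User": "admin@uc.com", "Action": "Deleted custom field Legacy_Id on Contact"},
    {"Date": "2024-05-10", "User": "ops@uc.com", "Action": "Changed page layout Account Layout"},
]
hits = recent_field_changes(rows, days=60, now=datetime(2024, 5, 15))
print(len(hits))  # → 1 (only the recent field creation matches)
```

Filtering on both the date window and the action text mirrors the filter-by-action-type-and-date approach described above.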
Question 32
Universal Containers wants to automatically archive all inactive Account data that is older than 3 years. The information does not need to remain accessible within the application. Which two methods should be recommended to meet this requirement? Choose 2 answers
Explanation:
Both C and D are valid methods to automatically archive and delete inactive Account data that is older than 3 years. You can use an ETL tool or the Data Loader to schedule jobs that export and then delete data based on certain criteria. Option A is not recommended because the Force.com Workbench is a web-based tool that does not support scheduling or automation. Option B is not suitable because the weekly export file from the Salesforce UI does not delete data from Salesforce.
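The selection criteria such a scheduled job would apply can be sketched in plain Python. The field names (`Active`, `LastActivityDate`) are illustrative, not necessarily the API names in a given org:

```python
# Sketch: pick inactive accounts whose last activity is more than 3 years old.
from datetime import datetime

THREE_YEARS_DAYS = 3 * 365  # simplification: leap days ignored

def is_archivable(account, today):
    """True if the account is inactive and its last activity is over 3 years old."""
    last = datetime.strptime(account["LastActivityDate"], "%Y-%m-%d")
    return not account["Active"] and (today - last).days > THREE_YEARS_DAYS

accounts = [
    {"Id": "001A", "Active": False, "LastActivityDate": "2019-06-01"},
    {"Id": "001B", "Active": False, "LastActivityDate": "2022-06-01"},
    {"Id": "001C", "Active": True,  "LastActivityDate": "2018-01-01"},
]
today = datetime(2024, 1, 15)
to_archive = [a["Id"] for a in accounts if is_archivable(a, today)]
print(to_archive)  # → ['001A']
```

An ETL tool or Data Loader job would apply equivalent criteria as a SOQL filter, export the matching records to external storage, then delete them.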
Question 33
Cloud Kicks needs to purge detailed transactional records from Salesforce. The data should be aggregated at a summary level and available in Salesforce.
What are two automated approaches to fulfill this goal? (Choose two.)
Explanation:
Both A and B are automated approaches to purge detailed transactional records from Salesforce and aggregate them at a summary level. You can use a third-party integration tool (ETL) or schedulable batch Apex to perform these tasks. Option C is not correct because a third-party business intelligence system does not purge data from Salesforce, but only analyzes it. Option D is not correct because Apex triggers are not scheduled processes; they execute only when a record is inserted, updated, deleted, or undeleted.
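Either automated approach performs essentially the same transform: roll detail rows up to summary records, then delete the details. A minimal Python sketch of that transform, with illustrative field names:

```python
# Sketch: summarize transactional details per (account, year), then list ids to purge.
from collections import defaultdict

def summarize_for_purge(details):
    """Roll detail transactions up to (account, year) totals; return summaries and purge ids."""
    totals = defaultdict(float)
    for d in details:
        totals[(d["AccountId"], d["Date"][:4])] += d["Amount"]
    summaries = [
        {"AccountId": acct, "Year": year, "Total": total}
        for (acct, year), total in sorted(totals.items())
    ]
    purge_ids = [d["Id"] for d in details]  # details are safe to delete once summarized
    return summaries, purge_ids

details = [
    {"Id": "T1", "AccountId": "001A", "Date": "2023-02-01", "Amount": 100.0},
    {"Id": "T2", "AccountId": "001A", "Date": "2023-09-15", "Amount": 50.0},
    {"Id": "T3", "AccountId": "001B", "Date": "2023-03-03", "Amount": 75.0},
]
summaries, purge_ids = summarize_for_purge(details)
print(summaries[0])  # → {'AccountId': '001A', 'Year': '2023', 'Total': 150.0}
```

In batch Apex this would be the `execute` logic run over chunks of records; in an ETL tool, a scheduled aggregate-then-delete job.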
Question 34
Universal Containers (UC) is concerned that data is being corrupted daily, either through negligence or maliciousness. They want to implement a backup strategy to help recover any corrupted data, or data mistakenly changed or even deleted. What should the data architect consider when designing a field-level audit and recovery plan?
Explanation:
Option C is the best answer because reviewing projected data storage needs is an important step in designing a field-level audit and recovery plan. You need to estimate how much data storage you will need in the future and plan accordingly. Option A is not correct because reducing data storage by purging old data may not be sufficient or desirable for backup purposes. Option B is not correct because an AppExchange package may not be customizable or compatible with your org's requirements. Option D is not correct because scheduling a weekly export file may not be frequent or granular enough for field-level audit and recovery.
Question 35
Every year, Ursa Major Solar has more than 1 million orders. Each order contains an average of 10 line items. The Chief Executive Officer (CEO) needs the Sales Reps to see how much money each customer generates year-over-year. However, data storage is running low in Salesforce.
Which approach for data archiving is appropriate for this scenario?
Explanation:
Option B is the most appropriate approach for data archiving in this scenario. By aggregating order amount data at a summary level and storing it in a custom object, you can reduce data storage and still provide visibility to the Sales Reps on how much money each customer generates year-over-year. Option A is not correct because storing data in a zip file does not make it available in Salesforce. Option C is not correct because exporting and deleting orders and order line items may lose important details that are needed for analysis or reporting. Option D is not correct because deleting orders and order line items without exporting them may cause data loss or inconsistency if the customer does not have order information in another system.
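Once yearly totals live in a summary custom object, the year-over-year comparison the CEO wants is a simple computation over those records. A hedged sketch, with an assumed input shape of customer-to-yearly-totals:

```python
# Sketch: compute year-over-year revenue per customer from summary rollups.
def year_over_year(rollups):
    """From {customer: {year: total}}, return {customer: {year: (total, change_vs_prior)}}."""
    report = {}
    for customer, yearly in rollups.items():
        report[customer], prev = {}, None
        for year in sorted(yearly):
            total = yearly[year]
            report[customer][year] = (total, None if prev is None else total - prev)
            prev = total
    return report

rollups = {"Acme": {2022: 120_000.0, 2023: 150_000.0}}
print(year_over_year(rollups)["Acme"][2023])  # → (150000.0, 30000.0)
```

The same comparison could be built as a Salesforce report on the summary object; the point is that only the rollups, not the 10 million yearly line items, need to stay in the org.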
Question 36
Get Cloudy Consulting monitors 15,000 servers, and these servers automatically record their status every 10 minutes. Because of company policy, these status reports must be maintained for 5 years. Managers at Get Cloudy Consulting need access to up to one week's worth of these status reports with all of their details.
An Architect is recommending what data should be integrated into Salesforce and for how long it should be stored in Salesforce.
Which two limits should the Architect be aware of? (Choose two.)
Explanation:
Data storage limits and API request limits are two important factors that affect the data integration and storage in Salesforce. Data storage limits determine how much data can be stored in Salesforce, and API request limits determine how many API calls can be made to Salesforce in a 24-hour period. Both of these limits depend on the edition and license type of the Salesforce org. Workflow rule limits and webservice callout limits are not directly related to data integration and storage, but rather to business logic and external services.
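The scale of this scenario can be checked with back-of-the-envelope arithmetic. The 2 KB-per-record figure below is the commonly cited Salesforce data-storage accounting assumption; confirm it against current documentation for your edition:

```python
# Sketch: estimate record volume and storage for one week of server status reports.
servers = 15_000
reports_per_day = 24 * 60 // 10            # one report every 10 minutes → 144/day
daily_records = servers * reports_per_day  # records created per day
week_records = daily_records * 7           # managers need one week of detail
kb_per_record = 2                          # assumed Salesforce storage accounting per record
week_storage_gb = week_records * kb_per_record / (1024 * 1024)
print(daily_records, week_records, round(week_storage_gb, 1))
# → 2160000 15120000 28.8
```

Over 2 million inbound records per day makes both limits concrete: the daily load must fit within the org's API request allocation (hence bulk-style loading), and even one week of detail consumes tens of gigabytes of data storage, which is why only a week is kept in Salesforce and the 5-year history lives elsewhere.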
Question 37
A Salesforce customer has plenty of data storage. Sales Reps are complaining that searches are bringing back old records that aren't relevant any longer. Sales Managers need the data for their historical reporting. What strategy should a data architect use to ensure a better user experience for the Sales Reps?
Explanation:
Archiving and purging old data from Salesforce on a monthly basis is a good strategy to improve the user experience for the Sales Reps, as it will reduce the clutter and improve the search performance. Creating a permission set or setting data access to private are not effective ways to hide old data from Sales Reps, as they will still consume data storage and affect search results. Using Batch Apex to archive old data on a rolling nightly basis is also not a good option, as it will consume API requests and processing time, and may not comply with the data retention policy.
Question 38
Universal Containers (UC) is implementing a formal, cross-business-unit data governance program. As part of the program, UC will implement a team to make decisions on enterprise-wide data governance. Which two roles are appropriate as members of this team? Choose 2 answers
Explanation:
Analytics/BI Owners and Data Domain Stewards are appropriate roles as members of a team that makes decisions on enterprise-wide data governance. Analytics/BI Owners are responsible for defining the business requirements and metrics for data analysis and reporting, and Data Domain Stewards are responsible for defining and enforcing the data quality standards and rules for specific data domains. Salesforce Administrators and Operational Data Users are not suitable roles for this team, as they are more focused on the operational aspects of data management, such as configuration, maintenance, and usage.
Question 39
Universal Containers (UC) has a complex system landscape and is implementing a data governance program for the first time. Which two first steps would be appropriate for UC to initiate an assessment of data architecture? Choose 2 answers
Explanation:
Engaging with executive sponsorship to assess enterprise data strategy and goals, and engaging with business units and IT to assess current operational systems and data models are two first steps that would be appropriate for UC to initiate an assessment of data architecture. These steps will help to understand the current state of data management, the business needs and expectations, and the gaps and opportunities for improvement. Engaging with IT program managers to assess current velocity of projects in the pipeline, and engaging with database administrators to assess current database performance metrics are not relevant steps for assessing data architecture, as they are more related to project management and technical performance.
Question 40
A data architect has been tasked with optimizing a data stewardship engagement for a Salesforce instance. Which three areas of Salesforce should the architect review before proposing any design recommendation? Choose 3 answers
Explanation:
Determining if any integration points create records in Salesforce, running key reports to determine what fields should be required, and reviewing the sharing model to determine impact on duplicate records are three areas of Salesforce that the architect should review before proposing any design recommendation. These areas will help to identify the sources and quality of data, the business rules and validations for data entry, and the access and visibility of data across users and roles. Reviewing the metadata XML files for redundant fields to consolidate, and exporting the setup audit trail to review what fields are being used, are not necessary steps for optimizing a data stewardship engagement, as they are more related to metadata management and audit tracking.