Salesforce Certified Data Architect Practice Test - Questions & Answers, Page 4

Question 31

How can an architect find out who has created, changed, or deleted certain fields within the past two months?

A. Remove 'customize application' permissions from everyone else.
B. Export the metadata and search it for the fields in question.
C. Create a field history report for the fields in question.
D. Export the setup audit trail and find the fields in question.
Suggested answer: D

Explanation:

The setup audit trail tracks changes made in your org's Setup area for the past 180 days. You can download it as a CSV file from Setup and filter the entries by action, user, or date to identify who created, changed, or deleted the fields in question.
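
The audit trail can also be pulled programmatically rather than downloaded from Setup. A minimal SOQL sketch against the standard SetupAuditTrail object, covering roughly the past two months (the Display field carries the human-readable description of each change):

```apex
// Query the setup audit trail for the past ~2 months of setup changes.
// Filter the Display text afterwards for the field names in question.
List<SetupAuditTrail> changes = [
    SELECT Action, Section, Display, CreatedDate, CreatedBy.Name
    FROM SetupAuditTrail
    WHERE CreatedDate = LAST_N_DAYS:60
    ORDER BY CreatedDate DESC
];
for (SetupAuditTrail change : changes) {
    System.debug(change.CreatedBy.Name + ' | ' + change.CreatedDate
        + ' | ' + change.Display);
}
```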

Question 32

Universal Containers wants to automatically archive all inactive Account data that is older than 3 years. The information does not need to remain accessible within the application. Which two methods should be recommended to meet this requirement? Choose 2 answers

A. Use the Force.com Workbench to export the data.
B. Schedule a weekly export file from the Salesforce UI.
C. Schedule jobs to export and delete using an ETL tool.
D. Schedule jobs to export and delete using the Data Loader.
Suggested answer: C, D

Explanation:

Both C and D are valid methods to automatically archive and delete inactive Account data that is older than 3 years: an ETL tool or the Data Loader command-line interface can schedule recurring jobs that export records matching the archiving criteria and then delete them. Option A is not recommended because the Force.com Workbench is a web-based tool that does not support scheduling or automation. Option B is not suitable because the weekly export file from the Salesforce UI exports data but does not delete it from Salesforce.
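
Whichever tool runs it, the scheduled job needs the same selection criteria. A sketch of the extraction query an ETL tool or the Data Loader command-line interface might schedule (Active__c is a hypothetical custom field; substitute the org's own definition of an inactive account):

```soql
SELECT Id, Name
FROM Account
WHERE Active__c = 'No'
  AND LastModifiedDate < LAST_N_YEARS:3
```

The exported Ids would then feed a second scheduled job that performs the delete step.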

Question 33

Cloud Kicks needs to purge detailed transactional records from Salesforce. The data should be aggregated at a summary level and available in Salesforce.

What are two automated approaches to fulfill this goal? (Choose two.)

A. Third-party Integration Tool (ETL)
B. Schedulable Batch Apex
C. Third-party Business Intelligence system
D. Apex Triggers
Suggested answer: A, B

Explanation:

Both A and B are automated approaches to purging detailed transactional records from Salesforce while keeping a summary-level aggregate in the application: a third-party integration tool (ETL) or schedulable Batch Apex can roll the details up into summary records and then delete them on a recurring schedule. Option C is not correct because a third-party business intelligence system only analyzes data; it does not purge it from Salesforce. Option D is not correct because Apex triggers fire only in response to individual record changes (insert, update, delete, undelete) and cannot be scheduled to run a recurring purge-and-summarize job.
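
A minimal sketch of the Batch Apex approach. Transaction__c, Transaction_Summary__c, and their fields are hypothetical placeholders for the org's own detail and summary objects:

```apex
// Aggregates old detail records into per-account totals, then deletes
// the details. Schedulable, so it can run on a recurring basis.
global class TransactionPurgeBatch implements Database.Batchable<sObject>, Schedulable {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator(
            'SELECT Id, Account__c, Amount__c FROM Transaction__c ' +
            'WHERE CreatedDate < LAST_N_YEARS:3');
    }

    global void execute(Database.BatchableContext bc, List<Transaction__c> scope) {
        Map<Id, Decimal> totals = new Map<Id, Decimal>();
        for (Transaction__c t : scope) {
            Decimal amount = t.Amount__c == null ? 0 : t.Amount__c;
            Decimal running = totals.containsKey(t.Account__c)
                ? totals.get(t.Account__c) : 0;
            totals.put(t.Account__c, running + amount);
        }
        // ... upsert Transaction_Summary__c rows keyed by account/period ...
        delete scope;   // purge the detail records
    }

    global void finish(Database.BatchableContext bc) {}

    global void execute(SchedulableContext sc) {
        Database.executeBatch(new TransactionPurgeBatch(), 200);
    }
}
```

It can then be scheduled on a recurring basis, for example monthly: `System.schedule('Archive transactions', '0 0 2 1 * ?', new TransactionPurgeBatch());`.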

Question 34

Universal Containers (UC) is concerned that data is being corrupted daily, either through negligence or maliciousness. They want to implement a backup strategy to help recover any corrupted data, or data that was mistakenly changed or deleted. What should the data architect consider when designing a field-level audit and recovery plan?

A. Reduce data storage by purging old data.
B. Implement an AppExchange package.
C. Review projected data storage needs.
D. Schedule a weekly export file.
Suggested answer: C

Explanation:

Option C is the best answer because reviewing projected data storage needs is a key step in designing a field-level audit and recovery plan: field history and backup data add to storage consumption, so you need to estimate future storage requirements and plan accordingly. Option A is not correct because purging old data to save storage conflicts with the goal of recovering changed or deleted data. Option B is not correct because an AppExchange package by itself may not fit the org's specific audit and recovery requirements. Option D is not correct because a weekly export is neither frequent nor granular enough for field-level audit and recovery.

Question 35

Every year, Ursa Major Solar has more than 1 million orders. Each order contains an average of 10 line items. The Chief Executive Officer (CEO) needs the Sales Reps to see how much money each customer generates year-over-year. However, data storage is running low in Salesforce.

Which approach for data archiving is appropriate for this scenario?

A. 1. Annually export and delete order line items. 2. Store them in a zip file in case the data is needed later.
B. 1. Annually aggregate order amount data to store in a custom object. 2. Delete those orders and order line items.
C. 1. Annually export and delete orders and order line items. 2. Store them in a zip file in case the data is needed later.
D. 1. Annually delete orders and order line items. 2. Ensure the customer has order information in another system.
Suggested answer: B

Explanation:

Option B is the most appropriate approach for data archiving in this scenario. Aggregating order amounts into a custom object before deleting the orders and line items reduces data storage while still letting Sales Reps see, inside Salesforce, how much money each customer generates year-over-year. Options A and C are not correct because data exported to a zip file is no longer available in the application. Option D is not correct because deleting orders and relying on another system leaves no revenue data in Salesforce for the Sales Reps at all.
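
A sketch of the aggregation step, using the standard Order fields (AccountId, EffectiveDate, TotalAmount) and a hypothetical Order_Summary__c custom object; at this data volume it would need to run inside Batch Apex or an ETL job to stay within query limits:

```apex
// Roll order revenue up per account and calendar year, then persist
// the totals before the orders and line items are deleted.
List<Order_Summary__c> summaries = new List<Order_Summary__c>();
for (AggregateResult ar : [
        SELECT AccountId acct, CALENDAR_YEAR(EffectiveDate) yr, SUM(TotalAmount) total
        FROM Order
        WHERE EffectiveDate < LAST_N_YEARS:3
        GROUP BY AccountId, CALENDAR_YEAR(EffectiveDate)]) {
    summaries.add(new Order_Summary__c(
        Account__c = (Id) ar.get('acct'),
        Year__c    = (Integer) ar.get('yr'),
        Revenue__c = (Decimal) ar.get('total')));
}
insert summaries;
```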

Question 36

Get Cloudy Consulting monitors 15,000 servers, and these servers automatically record their status every 10 minutes. Because of company policy, these status reports must be maintained for 5 years. Managers at Get Cloudy Consulting need access to up to one week's worth of these status reports with all of their details.

An Architect is recommending what data should be integrated into Salesforce and for how long it should be stored in Salesforce.

Which two limits should the Architect be aware of? (Choose two.)

A. Data storage limits
B. Workflow rule limits
C. API request limits
D. Web service callout limits
Suggested answer: A, C

Explanation:

Data storage limits and API request limits are the two limits that matter most here. Data storage limits cap how much record data the org can hold, and API request limits cap how many API calls can be made against the org in a rolling 24-hour period; both depend on the org's edition and license types. Workflow rule limits and web service callout limits relate to business logic and outbound calls to external services, not to loading and storing inbound data.

Question 37

A Salesforce customer has plenty of data storage. Sales Reps are complaining that searches are bringing back old records that aren't relevant any longer. Sales Managers need the data for their historical reporting. What strategy should a data architect use to ensure a better user experience for the Sales Reps?

A. Create a Permission Set to hide old data from Sales Reps.
B. Use Batch Apex to archive old data on a rolling nightly basis.
C. Archive and purge old data from Salesforce on a monthly basis.
D. Set data access to Private to hide old data from Sales Reps.
Suggested answer: C

Explanation:

Archiving and purging old data from Salesforce on a monthly basis is the right strategy: it reduces clutter and improves search performance for the Sales Reps, while the archived data remains available to the Sales Managers for historical reporting. Creating a permission set does not control record-level visibility, and setting data access to Private would also hide the data from the Sales Managers who need it; in both cases the old records continue to consume storage. Using Batch Apex on a rolling nightly basis adds code to build and maintain, consumes asynchronous processing limits, and a nightly rolling purge may not align with the data retention policy.

Question 38

Universal Containers (UC) is implementing a formal, cross-business-unit data governance program. As part of the program, UC will form a team to make decisions on enterprise-wide data governance. Which two roles are appropriate as members of this team? Choose 2 answers

A. Analytics/BI Owners
B. Data Domain Stewards
C. Salesforce Administrators
D. Operational Data Users
Suggested answer: A, B

Explanation:

Analytics/BI Owners and Data Domain Stewards are appropriate roles as members of a team that makes decisions on enterprise-wide data governance. Analytics/BI Owners are responsible for defining the business requirements and metrics for data analysis and reporting, and Data Domain Stewards are responsible for defining and enforcing the data quality standards and rules for specific data domains. Salesforce Administrators and Operational Data Users are not suitable roles for this team, as they are more focused on the operational aspects of data management, such as configuration, maintenance, and usage.

Question 39

Universal Containers (UC) has a complex system landscape and is implementing a data governance program for the first time. Which two first steps would be appropriate for UC to initiate an assessment of its data architecture? Choose 2 answers

A. Engage with IT program managers to assess current velocity of projects in the pipeline.
B. Engage with database administrators to assess current database performance metrics.
C. Engage with executive sponsorship to assess enterprise data strategy and goals.
D. Engage with business units and IT to assess current operational systems and data models.
Suggested answer: C, D

Explanation:

Engaging with executive sponsorship to assess enterprise data strategy and goals, and engaging with business units and IT to assess current operational systems and data models are two first steps that would be appropriate for UC to initiate an assessment of data architecture. These steps will help to understand the current state of data management, the business needs and expectations, and the gaps and opportunities for improvement. Engaging with IT program managers to assess current velocity of projects in the pipeline, and engaging with database administrators to assess current database performance metrics are not relevant steps for assessing data architecture, as they are more related to project management and technical performance.

Question 40

A data architect has been tasked with optimizing a data stewardship engagement for a Salesforce instance. Which three areas of Salesforce should the architect review before proposing any design recommendations? Choose 3 answers

A. Review the metadata XML files for redundant fields to consolidate.
B. Determine if any integration points create records in Salesforce.
C. Run key reports to determine what fields should be required.
D. Export the setup audit trail to review what fields are being used.
E. Review the sharing model to determine impact on duplicate records.
Suggested answer: B, C, E

Explanation:

Determining whether any integration points create records in Salesforce, running key reports to determine which fields should be required, and reviewing the sharing model to determine its impact on duplicate records are the three areas the architect should review before proposing any design recommendation. Together they reveal the sources and quality of incoming data, the business rules and validations for data entry, and the access and visibility of data across users and roles. Reviewing the metadata XML files for redundant fields and exporting the setup audit trail are not necessary steps for optimizing a data stewardship engagement, as they concern metadata management and audit tracking rather than data stewardship.
