ExamGecko

Salesforce Certified Data Architect Practice Test - Questions Answers, Page 4

How can an architect find information about who is creating, changing, or deleting certain fields within the past two months?
How can an architect find information about who is creating, changing, or deleting certain fields within the past two months?

A.
Remove 'customize application' permissions from everyone else.
B.
Export the metadata and search it for the fields in question.
C.
Create a field history report for the fields in question.
D.
Export the setup audit trail and find the fields in question.
Suggested answer: D

Explanation:

The setup audit trail tracks changes made in your org's Setup area for the past 180 days. You can export the setup audit trail as a CSV file and search it for the fields in question, filtering by action type, user, or date.
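Once exported, the audit trail is just a CSV to filter. The sketch below shows that filtering step locally; the column names and date format are assumptions about a typical export, so verify them against your org's actual file.

```python
# Hypothetical sketch: filter an exported setup audit trail (CSV) for
# field-related changes in the last 60 days. Column names ("Date",
# "User", "Action", "Section") and the ISO date format are assumptions.
import csv
import io
from datetime import datetime, timedelta

# Sample rows standing in for a real exported audit trail file.
SAMPLE_CSV = """Date,User,Action,Section
2024-05-01T10:00:00,admin@example.com,Created custom field Discount__c on Opportunity,Customize Opportunities
2024-01-15T09:30:00,admin@example.com,Deleted custom field Legacy__c on Account,Customize Accounts
2024-05-20T14:45:00,dev@example.com,Changed label for field Region__c on Account,Customize Accounts
"""

def recent_field_changes(csv_text, now, days=60):
    """Return audit rows whose Action mentions a field and whose Date
    falls within the last `days` days before `now`."""
    cutoff = now - timedelta(days=days)
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        when = datetime.fromisoformat(row["Date"])
        if when >= cutoff and "field" in row["Action"].lower():
            rows.append(row)
    return rows

changes = recent_field_changes(SAMPLE_CSV, now=datetime(2024, 6, 1))
for c in changes:
    print(c["Date"], c["User"], c["Action"])
```

With a 60-day window ending June 1, the January deletion drops out and the two recent field changes remain, which is exactly the "who changed which field in the past two months" question the audit trail answers.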

Universal Containers wants to automatically archive all inactive Account data that is older than 3 years. The information does not need to remain accessible within the application. Which two methods should be recommended to meet this requirement? Choose 2 answers

A.
Use the Force.com Workbench to export the data.
B.
Schedule a weekly export file from the Salesforce UI.
C.
Schedule jobs to export and delete using an ETL tool.
D.
Schedule jobs to export and delete using the Data Loader.
Suggested answer: C, D

Explanation:

Both C and D are valid methods to automatically archive and delete inactive Account data that is older than 3 years. You can use an ETL tool or the Data Loader to schedule jobs that export and then delete data matching the archiving criteria. Option A is not recommended because the Force.com Workbench is a web-based tool that does not support scheduling or automation. Option B is not suitable because the weekly export file from the Salesforce UI exports data but does not delete it from Salesforce.
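The selection criteria such a scheduled job would apply can be sketched as plain filtering logic; this is a local illustration, not Data Loader itself, and the `Active__c` flag and field names are hypothetical stand-ins for however UC marks inactivity.

```python
# Illustrative sketch of the export-and-delete criteria: pick inactive
# accounts whose last activity is older than three years. An ETL tool or
# Data Loader job would export `to_archive` and then delete those Ids.
# Field names (Active__c, LastActivityDate) are assumptions.
from datetime import date

def select_for_archive(accounts, today, years=3):
    """Split accounts into (to_archive, to_keep) by activity age."""
    cutoff = date(today.year - years, today.month, today.day)
    to_archive = [a for a in accounts
                  if not a["Active__c"] and a["LastActivityDate"] < cutoff]
    to_keep = [a for a in accounts if a not in to_archive]
    return to_archive, to_keep

accounts = [
    {"Id": "001A", "Active__c": False, "LastActivityDate": date(2019, 3, 1)},
    {"Id": "001B", "Active__c": True,  "LastActivityDate": date(2019, 3, 1)},
    {"Id": "001C", "Active__c": False, "LastActivityDate": date(2024, 1, 5)},
]
archive, keep = select_for_archive(accounts, today=date(2024, 6, 1))
```

Note that both conditions must hold: an old-but-active account (001B) and a recently touched inactive one (001C) both stay, and only 001A is archived.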

Cloud Kicks needs to purge detailed transactional records from Salesforce. The data should be aggregated at a summary level and available in Salesforce.

What are two automated approaches to fulfill this goal? (Choose two.)

A.
Third-party Integration Tool (ETL)
B.
Schedulable Batch Apex
C.
Third-party Business Intelligence system
D.
Apex Triggers
Suggested answer: A, B

Explanation:

Both A and B are automated approaches to purge detailed transactional records from Salesforce and aggregate them at a summary level. You can use a third-party integration tool (ETL) or schedulable Batch Apex to perform these tasks on a recurring schedule. Option C is not correct because a third-party business intelligence system only analyzes data; it does not purge data from Salesforce. Option D is not correct because Apex triggers execute only when a record is inserted, updated, deleted, or undeleted; they cannot run on a schedule to purge and summarize historical records.
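The Batch Apex pattern from option B can be sketched as chunked processing: aggregate each scope of detail records into a running summary, then purge the chunk. This is a pure-Python stand-in for Apex's start/execute scopes, with illustrative record shapes, not real Apex.

```python
# Minimal sketch of the Batch Apex pattern: process the purge in
# fixed-size scopes (Batch Apex defaults to 200 records per execute),
# accumulating summary totals as each chunk of detail rows is "deleted".
def run_batches(records, scope_size=200):
    """Iterate records in chunks; return (summary_total, deleted_count)."""
    total = 0.0
    deleted = 0
    for start in range(0, len(records), scope_size):
        scope = records[start:start + scope_size]   # one execute() scope
        total += sum(r["amount"] for r in scope)    # aggregate first...
        deleted += len(scope)                       # ...then purge the chunk
    return total, deleted

records = [{"amount": 2.5} for _ in range(450)]
total, deleted = run_batches(records)
```

The point of the chunking is governor limits: each scope is a separate transaction, so arbitrarily large purges stay within per-transaction limits while the summary survives the deletions.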

Universal Containers (UC) is concerned that data is being corrupted daily, either through negligence or maliciousness. They want to implement a backup strategy to help recover any corrupted data or data mistakenly changed or even deleted. What should the data architect consider when designing a field-level audit and recovery plan?

A.
Reduce data storage by purging old data.
B.
Implement an AppExchange package.
C.
Review projected data storage needs.
D.
Schedule a weekly export file.
Suggested answer: C

Explanation:

Option C is the best answer because reviewing projected data storage needs is an important step in designing a field-level audit and recovery plan. You need to estimate how much storage field history and backup data will require in the future and plan accordingly. Option A is not correct because reducing data storage by purging old data may not be sufficient or desirable for backup purposes. Option B is not correct because an AppExchange package may not be customizable or compatible with your org's requirements. Option D is not correct because a weekly export file may not be frequent or granular enough for field-level audit and recovery.

Every year, Ursa Major Solar has more than 1 million orders. Each order contains an average of 10 line items. The Chief Executive Officer (CEO) needs the Sales Reps to see how much money each customer generates year-over-year. However, data storage is running low in Salesforce.

Which approach for data archiving is appropriate for this scenario?

A.
1. Annually export and delete order line items. 2. Store them in a zip file in case the data is needed later.
B.
1. Annually aggregate order amount data to store in a custom object. 2. Delete those orders and order line items.
C.
1. Annually export and delete orders and order line items. 2. Store them in a zip file in case the data is needed later.
D.
1. Annually delete orders and order line items. 2. Ensure the customer has order information in another system.
Suggested answer: B

Explanation:

Option B is the most appropriate approach for data archiving in this scenario. By aggregating order amount data at a summary level and storing it in a custom object, you can reduce data storage while still giving the Sales Reps visibility into how much money each customer generates year-over-year. Option A is not correct because storing data in a zip file does not keep it available in Salesforce. Option C is not correct because exporting and deleting orders and order line items removes the data from Salesforce entirely, so the year-over-year figures would no longer be visible to the Sales Reps. Option D is not correct because deleting orders and order line items without exporting them risks data loss if the customer does not have order information in another system.
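Option B's aggregation step amounts to a group-by on customer and year before the detail rows are deleted. A minimal sketch, with hypothetical customer and amount field names:

```python
# Illustrative sketch of option B: roll detailed orders up into
# per-customer, per-year totals. Each (customer, year) total would be
# upserted into the summary custom object before deleting the source
# orders and order line items. Field names are assumptions.
from collections import defaultdict
from datetime import date

def summarize_orders(orders):
    """Return {(customer_id, year): total_amount} from detail rows."""
    totals = defaultdict(float)
    for o in orders:
        totals[(o["customer"], o["order_date"].year)] += o["amount"]
    return dict(totals)

orders = [
    {"customer": "ACME",   "order_date": date(2022, 2, 1), "amount": 100.0},
    {"customer": "ACME",   "order_date": date(2022, 9, 9), "amount": 50.0},
    {"customer": "ACME",   "order_date": date(2023, 1, 3), "amount": 75.0},
    {"customer": "Globex", "order_date": date(2023, 4, 8), "amount": 20.0},
]
summary = summarize_orders(orders)
```

The payoff is the compression ratio: a million orders and ten million line items per year collapse into at most one summary row per customer per year, which is why storage pressure drops while the CEO's year-over-year view survives.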

Get Cloudy Consulting monitors 15,000 servers, and these servers automatically record their status every 10 minutes. Because of company policy, these status reports must be maintained for 5 years. Managers at Get Cloudy Consulting need access to up to one week's worth of these status reports with all of their details.

An Architect is recommending what data should be integrated into Salesforce and for how long it should be stored in Salesforce.

Which two limits should the Architect be aware of? (Choose two.)

A.
Data storage limits
B.
Workflow rule limits
C.
API Request limits
D.
Webservice callout limits
Suggested answer: A, C

Explanation:

Data storage limits and API request limits are two important factors that affect the data integration and storage in Salesforce. Data storage limits determine how much data can be stored in Salesforce, and API request limits determine how many API calls can be made to Salesforce in a 24-hour period. Both of these limits depend on the edition and license type of the Salesforce org. Workflow rule limits and webservice callout limits are not directly related to data integration and storage, but rather to business logic and external services.
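The scale driving both limits can be checked with quick arithmetic (leap days ignored):

```python
# Back-of-envelope sizing for the scenario: 15,000 servers, one status
# row every 10 minutes, 5-year retention, one week kept in Salesforce.
SERVERS = 15_000
REPORTS_PER_HOUR = 60 // 10            # one status row every 10 minutes

per_day = SERVERS * REPORTS_PER_HOUR * 24      # rows created per day
one_week_in_sf = per_day * 7                   # what managers need online
five_years_total = per_day * 365 * 5           # what the policy retains

print(f"{per_day:,} rows/day")
print(f"{one_week_in_sf:,} rows kept in Salesforce")
print(f"{five_years_total:,} rows retained overall")
```

Roughly 2.16 million rows arrive per day, about 15 million live in Salesforce at any time, and nearly 4 billion must be retained overall. That daily ingest volume is what presses on API request limits, and at Salesforce's typical 2 KB-per-record storage accounting even the one-week window is on the order of 30 GB, which is what presses on data storage limits; the 5-year archive clearly belongs in an external store.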

A Salesforce customer has plenty of data storage. Sales Reps are complaining that searches are bringing back old records that aren't relevant any longer. Sales Managers need the data for their historical reporting. What strategy should a data architect use to ensure a better user experience for the Sales Reps?

A.
Create a Permission Set to hide old data from Sales Reps.
B.
Use Batch Apex to archive old data on a rolling nightly basis.
C.
Archive and purge old data from Salesforce on a monthly basis.
D.
Set data access to Private to hide old data from Sales Reps.
Suggested answer: C

Explanation:

Archiving and purging old data from Salesforce on a monthly basis is a good strategy to improve the user experience for the Sales Reps, as it reduces clutter and improves search performance while keeping the archived data available for the Sales Managers' historical reporting. Creating a permission set or setting data access to Private does not remove old data, so it still consumes data storage and continues to surface in search results. Using Batch Apex to archive old data on a rolling nightly basis is also not the best option, as it adds nightly processing overhead and may remove records more aggressively than the managers' reporting needs allow.

Universal Containers (UC) is implementing a formal, cross-business-unit data governance program. As part of the program, UC will implement a team to make decisions on enterprise-wide data governance. Which two roles are appropriate as members of this team? Choose 2 answers

A.
Analytics/BI Owners
B.
Data Domain Stewards
C.
Salesforce Administrators
D.
Operational Data Users
Suggested answer: A, B

Explanation:

Analytics/BI Owners and Data Domain Stewards are appropriate roles as members of a team that makes decisions on enterprise-wide data governance. Analytics/BI Owners are responsible for defining the business requirements and metrics for data analysis and reporting, and Data Domain Stewards are responsible for defining and enforcing the data quality standards and rules for specific data domains. Salesforce Administrators and Operational Data Users are not suitable roles for this team, as they are more focused on the operational aspects of data management, such as configuration, maintenance, and usage.

Universal Containers (UC) has a complex system landscape and is implementing a data governance program for the first time. Which two first steps would be appropriate for UC to initiate an assessment of data architecture? Choose 2 answers

A.
Engage with IT program managers to assess current velocity of projects in the pipeline.
B.
Engage with database administrators to assess current database performance metrics.
C.
Engage with executive sponsorship to assess enterprise data strategy and goals.
D.
Engage with business units and IT to assess current operational systems and data models.
Suggested answer: C, D

Explanation:

Engaging with executive sponsorship to assess enterprise data strategy and goals, and engaging with business units and IT to assess current operational systems and data models are two first steps that would be appropriate for UC to initiate an assessment of data architecture. These steps will help to understand the current state of data management, the business needs and expectations, and the gaps and opportunities for improvement. Engaging with IT program managers to assess current velocity of projects in the pipeline, and engaging with database administrators to assess current database performance metrics are not relevant steps for assessing data architecture, as they are more related to project management and technical performance.

A data architect has been tasked with optimizing a data stewardship engagement for a Salesforce instance. Which three areas of Salesforce should the architect review before proposing any design recommendation? Choose 3 answers

A.
Review the metadata xml files for redundant fields to consolidate.
B.
Determine if any integration points create records in Salesforce.
C.
Run key reports to determine what fields should be required.
D.
Export the setup audit trail to review what fields are being used.
E.
Review the sharing model to determine impact on duplicate records.
Suggested answer: B, C, E

Explanation:

Determining if any integration points create records in Salesforce, running key reports to determine what fields should be required, and reviewing the sharing model to determine impact on duplicate records are three areas of Salesforce that the architect should review before proposing any design recommendation. These areas will help to identify the sources and quality of data, the business rules and validations for data entry, and the access and visibility of data across users and roles. Reviewing the metadata xml files for redundant fields to consolidate, and exporting the setup audit trail to review what fields are being used are not necessary steps for optimizing a data stewardship engagement, as they are more related to metadata management and audit tracking.

Total 260 questions