Salesforce Certified Data Architect Practice Test - Questions Answers, Page 7

Question 61

Developers at Universal Containers need to build a report for the business which displays Accounts opened in the past year grouped by industry. This report will also include information from contacts, opportunities, and orders. There are several million Accounts in the system. Which two options should be recommended to make this report perform well and satisfy the business need?

A. Use triggers to populate denormalized related fields on the Account.
B. Use an indexed date field with bounded date filters.
C. Use formula fields to surface information from related entities on the report.
D. Use unbounded date ranges to filter the report.
Suggested answer: B, C

Explanation:

Using an indexed date field with bounded date filters improves report performance by making the underlying query more selective and reducing the number of records that must be scanned. Using formula fields to surface information from related entities on the report also helps performance by avoiding extra joins and complex calculations in the report itself.
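
For illustration, here is a minimal Python sketch (using the simple_salesforce library; the credentials and exact query are assumptions, not part of the exam material) of the kind of selective, bounded query that sits behind such a report. A bounded filter on an indexed date field such as CreatedDate lets the query optimizer use the index instead of scanning all several million Account rows.

```python
from simple_salesforce import Salesforce

# Placeholder credentials; replace with your own org's login details.
sf = Salesforce(username="user@example.com", password="pwd", security_token="token")

# Bounded filter on an indexed date field (CreatedDate is indexed by default).
# The one-year window keeps the query selective enough for the index to be used.
soql = """
    SELECT Industry, COUNT(Id) accounts
    FROM Account
    WHERE CreatedDate = LAST_N_DAYS:365
    GROUP BY Industry
"""
for row in sf.query_all(soql)["records"]:
    print(row["Industry"], row["accounts"])
```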

Question 62

A customer wishes to migrate 700,000 Account records in a single migration into Salesforce. What is the recommended solution to migrate these records while minimizing migration time?

A. Use Salesforce SOAP API in parallel mode.
B. Use Salesforce Bulk API in serial mode.
C. Use Salesforce Bulk API in parallel mode.
D. Use Salesforce SOAP API in serial mode.
Suggested answer: C

Explanation:

Using the Salesforce Bulk API in parallel mode reduces migration time by processing multiple batches of records simultaneously and using server resources more efficiently. The Bulk API is designed for loading large volumes of data into Salesforce.
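
As a rough sketch of what this looks like in practice (Bulk API 1.0 via the Python requests library; the instance URL, session ID, and CSV file are placeholders), the job below is created in parallel mode and then fed CSV batches that Salesforce processes concurrently:

```python
import requests

# Placeholder values; obtain a session ID via OAuth or SOAP login first.
INSTANCE = "https://yourInstance.my.salesforce.com"
SESSION_ID = "00D...session_id"
XML_HEADERS = {"X-SFDC-Session": SESSION_ID,
               "Content-Type": "application/xml; charset=UTF-8"}

# Bulk API 1.0 jobs default to parallel mode; setting concurrencyMode makes the intent explicit.
job_xml = """<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <operation>insert</operation>
  <object>Account</object>
  <concurrencyMode>Parallel</concurrencyMode>
  <contentType>CSV</contentType>
</jobInfo>"""

job_response = requests.post(f"{INSTANCE}/services/async/58.0/job",
                             headers=XML_HEADERS, data=job_xml)
job_id = job_response.text.split("<id>")[1].split("</id>")[0]  # crude parse of the <id> element

# Each CSV batch (up to 10,000 records) is then posted to the job and processed concurrently:
# requests.post(f"{INSTANCE}/services/async/58.0/job/{job_id}/batch",
#               headers={"X-SFDC-Session": SESSION_ID, "Content-Type": "text/csv; charset=UTF-8"},
#               data=open("accounts_batch_01.csv", "rb"))
```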

Question 63

Universal Containers has millions of rows of data in Salesforce that are being used in reports to evaluate historical trends. Performance has become an issue, as well as data storage limits. Which two strategies should be recommended when talking with stakeholders?

A. Use scheduled batch Apex to copy aggregate information into a custom object and delete the original records.
B. Combine Analytics Snapshots with a purging plan by reporting on the snapshot data and deleting the original records.
C. Use Data Loader to extract data, aggregate it, and write it back to a custom object, then delete the original records.
D. Configure the Salesforce Archiving feature to archive older records and remove them from the data storage limits.
Suggested answer: A, D

Explanation:

Using scheduled batch Apex to copy aggregate information into a custom object and then delete the original records improves performance and reduces data storage consumption by removing unnecessary detail records and keeping only the summary data needed for reporting. Configuring the Salesforce Archiving feature to archive older records and remove them from the data storage limits also helps with performance and storage by moving historical data out of the operational dataset while keeping it accessible.

Question 64

Universal Containers (UC) has implemented Sales Cloud and it has been noticed that Sales reps are not entering enough data to run insightful reports and dashboards. UC executives would like to monitor and measure data quality metrics. What solution addresses this requirement?

A. Use third-party AppExchange tools to monitor and measure data quality.
B. Generate reports to view the quality of sample data.
C. Use custom objects and fields to calculate data quality.
D. Export the data to an enterprise data warehouse and use BI tools for data quality.
Suggested answer: A

Explanation:

Using third-party AppExchange tools to monitor and measure data quality addresses the executives' requirement by providing features such as data cleansing, deduplication, validation, enrichment, and scoring. These tools help improve the accuracy, completeness, and consistency of the data entered by sales reps.
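
To make "data quality metrics" concrete, the following minimal Python sketch (simple_salesforce; the choice of object and fields is an assumption) computes the kind of field-completeness score that such AppExchange tools track and trend automatically:

```python
from simple_salesforce import Salesforce

# Placeholder credentials.
sf = Salesforce(username="user@example.com", password="pwd", security_token="token")

# Completeness metric: what share of open Opportunities have a Next Step recorded?
total = sf.query("SELECT COUNT() FROM Opportunity WHERE IsClosed = false")["totalSize"]
missing = sf.query(
    "SELECT COUNT() FROM Opportunity WHERE IsClosed = false AND NextStep = null"
)["totalSize"]

if total:
    print(f"NextStep completeness: {100 * (total - missing) / total:.1f}%")
```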

Question 65

A shipping and logistics company has created a large number of reports within Sales Cloud since Salesforce was introduced. Some of these reports analyze large amounts of data regarding the whereabouts of the company's containers, and they are starting to time out when users are trying to run the reports. What is a recommended approach to avoid these time-out issues?

A. Improve reporting performance by creating a custom Visualforce report that uses a cache of the records in the report.
B. Improve reporting performance by replacing the existing reports in Sales Cloud with new reports based on Analytics Cloud.
C. Improve reporting performance by creating an Apex trigger for the Report object that will pre-fetch data before the report is run.
D. Improve reporting performance by creating a dashboard that is scheduled to run the reports only once per day.
Suggested answer: B

Explanation:

Replacing the existing reports in Sales Cloud with new reports based on Analytics Cloud avoids the time-out issues by leveraging the power and scalability of Analytics Cloud, which can handle large volumes of data and deliver faster, more interactive reports than standard Sales Cloud reporting.

Question 66

Due to security requirements, Universal Containers needs to capture specific user actions, such as login, logout, file attachment download, package install, etc. What is the recommended approach for defining a solution for this requirement?

A. Use a field audit trail to capture field changes.
B. Use a custom object and trigger to capture changes.
C. Use Event Monitoring to capture these changes.
D. Use a third-party AppExchange app to capture changes.
Suggested answer: C

Explanation:

Event Monitoring is a feature that tracks user actions such as logins, logouts, file downloads, and package installs in your Salesforce org. You can use Event Monitoring data to monitor performance, usage, security, and compliance.
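
For example, the login events captured by Event Monitoring can be retrieved through the EventLogFile object. A hedged Python sketch using the REST API (the instance URL and access token are placeholders):

```python
import csv
import io
import requests

INSTANCE = "https://yourInstance.my.salesforce.com"  # placeholder
TOKEN = "placeholder_oauth_access_token"             # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
API = f"{INSTANCE}/services/data/v58.0"

# EventLogFile records are generated per event type (Login, Logout, ContentTransfer, and others).
soql = ("SELECT Id, EventType, LogDate FROM EventLogFile "
        "WHERE EventType = 'Login' ORDER BY LogDate DESC LIMIT 1")
records = requests.get(f"{API}/query", headers=HEADERS, params={"q": soql}).json()["records"]

# The LogFile field is a blob: download it and parse the CSV of individual login events.
if records:
    blob = requests.get(f"{API}/sobjects/EventLogFile/{records[0]['Id']}/LogFile", headers=HEADERS)
    for row in csv.DictReader(io.StringIO(blob.text)):
        print(row.get("USER_ID"), row.get("LOGIN_STATUS"), row.get("SOURCE_IP"))
```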

Question 67

Universal Containers (UC) is concerned about the accuracy of their Customer information in Salesforce. They have recently created an enterprise-wide trusted source MDM for Customer data which they have certified to be accurate. UC has over 20 million unique customer records in the trusted source and Salesforce. What should an Architect recommend to ensure the data in Salesforce is identical to the MDM?

A. Extract the Salesforce data into Excel and manually compare it against the trusted source.
B. Load the trusted source data into Salesforce and run an Apex batch job to find differences.
C. Use an AppExchange package for Data Quality to match Salesforce data against the trusted source.
D. Leave the data in Salesforce alone and assume that it will auto-correct itself over time.
Suggested answer: C

Explanation:

Using an AppExchange package for Data Quality is a good way to match Salesforce data against a trusted source such as an MDM system. Tools like Cloudingo, DupeCatcher, or DemandTools can identify and merge duplicate records, standardize data formats, and enrich data from external sources.
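
As an illustration of the key-based matching such packages automate (at 20 million records they also handle fuzzy matching, batching, and scale far better than a hand-rolled script), here is a minimal Python sketch comparing a Salesforce extract with the MDM extract on a shared customer key; the file names and column names are assumptions:

```python
import csv

def load(path, key, fields):
    """Index a CSV extract by the shared customer key."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[key]: {c: row[c].strip() for c in fields} for row in csv.DictReader(f)}

FIELDS = ["Name", "BillingCity", "Phone"]  # assumed comparison columns
mdm = load("mdm_customers.csv", "Customer_Id", FIELDS)
sfdc = load("salesforce_accounts.csv", "Customer_Id__c", FIELDS)

missing = [k for k in mdm if k not in sfdc]
mismatched = [k for k in mdm if k in sfdc and mdm[k] != sfdc[k]]
print(f"Missing in Salesforce: {len(missing)}, field mismatches: {len(mismatched)}")
```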

Question 68

Which two statements are accurate with respect to performance testing a Force.com application?

A. All Force.com applications must be performance tested in a sandbox as well as production.
B. A performance test plan must be created and submitted to Salesforce customer support.
C. Applications with highly customized code or large volumes should be performance tested.
D. Application performance benchmarked in a sandbox can also be expected in production.
Suggested answer: B, C

Explanation:

Applications with highly customized code or large data volumes should be performance tested to confirm they meet the expected response times and throughput. For such applications, a performance test plan must be created and submitted to Salesforce Customer Support for approval before any performance testing is conducted.

Question 69

Universal Containers (UC) is launching an RFP to acquire a new accounting product available on AppExchange. UC is expecting to issue 5 million invoices per year, with each invoice containing an average of 10 line items. What should UC's Data Architect recommend to ensure scalability?

A. Ensure invoice line items simply reference existing Opportunity line items.
B. Ensure the accounting product vendor includes Wave Analytics in their offering.
C. Ensure the accounting product vendor provides a sound data archiving strategy.
D. Ensure the accounting product runs 100% natively on the Salesforce platform.
Suggested answer: C

Explanation:

A sound data archiving strategy is essential for ensuring the scalability and performance of any application that deals with large volumes of data, and 5 million invoices per year with 10 line items each means roughly 55 million new records annually. Data retention policies, storage limits, backup and recovery options, and data access requirements all need to be considered when designing the archiving solution, and tools like OwnBackup, Odaseva, or ArchiveIt can archive data from Salesforce to external systems.
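
A simplified Python sketch of the extract-then-purge step in such an archiving strategy (simple_salesforce; the invoice object and field API names are hypothetical, and a local CSV stands in for the external archive):

```python
import csv
from simple_salesforce import Salesforce

# Placeholder credentials.
sf = Salesforce(username="user@example.com", password="pwd", security_token="token")

# Hypothetical custom object and fields from the accounting package.
FIELDS = ["Id", "Name", "Amount__c", "Invoice_Date__c"]
soql = (f"SELECT {', '.join(FIELDS)} FROM Invoice__c "
        "WHERE Invoice_Date__c < LAST_N_YEARS:2")
old_invoices = sf.query_all(soql)["records"]

# 1. Copy the records to external storage (a local CSV here, standing in for a warehouse or archive).
with open("archived_invoices.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    for rec in old_invoices:
        writer.writerow({k: rec[k] for k in FIELDS})

# 2. Delete the originals in bulk to reclaim data storage; detail line items cascade if master-detail.
sf.bulk.Invoice__c.delete([{"Id": rec["Id"]} for rec in old_invoices])
```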

Question 70

Universal Containers (UC) has a custom discount request object set as a detail object with a custom product object as the master. There is a requirement to allow the creation of generic discount requests without the custom product object as its master record. What solution should an Architect recommend to UC?

A. Mandate the selection of a custom product for each discount request.
B. Create a placeholder product record for the generic discount request.
C. Remove the master-detail relationship and keep the objects separate.
D. Change the master-detail relationship to a lookup relationship.
Suggested answer: D

Explanation:

Changing the master-detail relationship to a lookup relationship is the best solution for allowing generic discount requests to be created without a custom product master record. A lookup relationship lets child records be created without a parent record and gives more flexibility in defining the sharing and security settings for each object.

Total 260 questions