ExamGecko
Question 630 - SAP-C01 discussion


A company is running an Apache Hadoop cluster on Amazon EC2 instances. The Hadoop cluster stores approximately 100 TB of data for weekly operational reports and allows occasional access for data scientists to retrieve data. The company needs to reduce the cost and operational complexity for storing and serving this data.

Which solution meets these requirements in the MOST cost-effective manner?

A. Move the Hadoop cluster from EC2 instances to Amazon EMR. Allow data access patterns to remain the same.

B. Write a script that resizes the EC2 instances to a smaller instance type during downtime and resizes the instances to a larger instance type before the reports are created.

C. Move the data to Amazon S3 and use Amazon Athena to query the data for reports. Allow the data scientists to access the data directly in Amazon S3.

D. Migrate the data to Amazon DynamoDB and modify the reports to fetch data from DynamoDB. Allow the data scientists to access the data directly in DynamoDB.
Suggested answer: C
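As context for option C, the serverless workflow might look like the following minimal sketch: the report data sits in S3, and a query is submitted to Athena with boto3's `start_query_execution` API. The bucket, database, and table names here are illustrative assumptions, not from the question.

```python
# Sketch of option C: query S3-resident data with Amazon Athena.
# All bucket/database/table names below are hypothetical examples.

def build_athena_request(query: str, database: str, output_bucket: str) -> dict:
    """Assemble the parameters for Athena's StartQueryExecution call."""
    return {
        "QueryString": query,
        "QueryExecutionContext": {"Database": database},
        "ResultConfiguration": {
            # Athena writes query results back to S3.
            "OutputLocation": f"s3://{output_bucket}/athena-results/"
        },
    }

request = build_athena_request(
    "SELECT report_week, SUM(events) FROM operational_logs GROUP BY report_week",
    database="weekly_reports",               # hypothetical Glue/Athena database
    output_bucket="example-athena-output",   # hypothetical results bucket
)

# With AWS credentials configured, the actual call would be:
#   import boto3
#   athena = boto3.client("athena")
#   response = athena.start_query_execution(**request)
print(request["ResultConfiguration"]["OutputLocation"])
```

This is why C is the most cost-effective choice in the scenario: S3 storage plus per-query Athena pricing replaces an always-on 100 TB Hadoop cluster, and data scientists can read the same S3 objects directly without any cluster to operate.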
asked 16/09/2024 by Novka Mandic