Question 445 - SAP-C02 discussion


A company has an application that analyzes and stores image data on premises. The application receives millions of new image files every day. Files are an average of 1 MB in size. The files are analyzed in batches of 1 GB. When the application analyzes a batch, the application zips the images together. The application then archives the images as a single file on an on-premises NFS server for long-term storage.

The company has a Microsoft Hyper-V environment on premises and has compute capacity available. The company does not have storage capacity and wants to archive the images on AWS. The company needs the ability to retrieve archived data within 1 week of a request.

The company has a 10 Gbps AWS Direct Connect connection between its on-premises data center and AWS. The company needs to set bandwidth limits and schedule the archived images to be copied to AWS during non-business hours.

Which solution will meet these requirements MOST cost-effectively?

A.
Deploy an AWS DataSync agent on a new GPU-based Amazon EC2 instance. Configure the DataSync agent to copy the batch of files from the on-premises NFS server to Amazon S3 Glacier Instant Retrieval. After the successful copy, delete the data from the on-premises storage.
B.
Deploy an AWS DataSync agent as a Hyper-V VM on premises. Configure the DataSync agent to copy the batch of files from the on-premises NFS server to Amazon S3 Glacier Deep Archive. After the successful copy, delete the data from the on-premises storage.
C.
Deploy an AWS DataSync agent on a new general purpose Amazon EC2 instance. Configure the DataSync agent to copy the batch of files from the on-premises NFS server to Amazon S3 Standard. After the successful copy, delete the data from the on-premises storage. Create an S3 Lifecycle rule to transition objects from S3 Standard to S3 Glacier Deep Archive after 1 day.
D.
Deploy an AWS Storage Gateway Tape Gateway on premises in the Hyper-V environment. Connect the Tape Gateway to AWS. Use automatic tape creation, and specify an Amazon S3 Glacier Deep Archive pool. Eject the tape after the batch of images is copied.
Suggested answer: B

Explanation:

Deploy DataSync Agent:

Install the AWS DataSync agent as a VM in your Hyper-V environment. This agent facilitates the data transfer between your on-premises storage and AWS.
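As an illustrative sketch (not part of the original answer), the agent can be registered with boto3 once its activation key has been retrieved from the agent VM's local console; the key, agent name, and Region below are placeholders:

```python
import boto3

datasync = boto3.client("datasync", region_name="us-east-1")

# Register the on-premises Hyper-V agent with the DataSync service.
# The activation key is read from the agent VM itself; this value
# is a placeholder.
agent = datasync.create_agent(
    ActivationKey="AAAAA-BBBBB-CCCCC-DDDDD-EEEEE",
    AgentName="hyperv-archive-agent",
)
print(agent["AgentArn"])
```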

Configure Source and Destination:

Set up the source location to point to your on-premises NFS server where the image batches are stored.

Configure the destination location to be an Amazon S3 bucket that uses the S3 Glacier Deep Archive storage class. This is the lowest-cost S3 storage class for long-term archiving; standard retrievals complete within 12 hours, comfortably inside the company's 1-week retrieval requirement.
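A minimal boto3 sketch of both locations, assuming a hypothetical NFS hostname, export path, bucket, and IAM role; note S3StorageClass="DEEP_ARCHIVE", which writes objects straight to Deep Archive:

```python
import boto3

datasync = boto3.client("datasync", region_name="us-east-1")

# ARN of the Hyper-V agent registered earlier (placeholder value).
agent_arn = "arn:aws:datasync:us-east-1:123456789012:agent/agent-0123456789abcdef0"

# Source: the on-premises NFS export that holds the zipped image batches.
source = datasync.create_location_nfs(
    ServerHostname="nfs.example.internal",   # placeholder hostname
    Subdirectory="/exports/image-archives",  # placeholder export path
    OnPremConfig={"AgentArns": [agent_arn]},
)

# Destination: an S3 bucket, with objects written directly to Deep Archive.
destination = datasync.create_location_s3(
    S3BucketArn="arn:aws:s3:::example-image-archive",  # placeholder bucket
    S3StorageClass="DEEP_ARCHIVE",
    S3Config={
        "BucketAccessRoleArn": "arn:aws:iam::123456789012:role/DataSyncS3Role"
    },
)
print(source["LocationArn"], destination["LocationArn"])
```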

Create DataSync Tasks:

Create and configure DataSync tasks to manage the data transfer. Schedule these tasks to run during non-business hours to minimize bandwidth usage during peak times. The tasks will handle the copying of data batches from the NFS server to the S3 bucket.
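A hedged sketch of the task, assuming placeholder location ARNs and a nightly 22:00 UTC window as the non-business-hours schedule. ONLY_FILES_TRANSFERRED verification is used because a full point-in-time comparison would have to read objects back out of Deep Archive:

```python
import boto3

datasync = boto3.client("datasync", region_name="us-east-1")

# Location ARNs returned by the create_location_* calls (placeholders).
source_arn = "arn:aws:datasync:us-east-1:123456789012:location/loc-1111111111111111a"
dest_arn = "arn:aws:datasync:us-east-1:123456789012:location/loc-2222222222222222b"

task = datasync.create_task(
    SourceLocationArn=source_arn,
    DestinationLocationArn=dest_arn,
    Name="nightly-image-archive",
    # Run every night at 22:00 UTC -- an assumed non-business-hours window.
    Schedule={"ScheduleExpression": "cron(0 22 * * ? *)"},
    Options={
        # Checksum-verify only the files transferred in this run.
        "VerifyMode": "ONLY_FILES_TRANSFERRED",
        "OverwriteMode": "NEVER",  # archive batches are write-once
    },
)
print(task["TaskArn"])
```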

Set Bandwidth Limits:

In the DataSync configuration, set bandwidth limits to control the amount of data being transferred at any given time. This ensures that your network's performance is not adversely affected during business hours.
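For example, a cap can be applied through the task's BytesPerSecond option; the 500 MB/s figure below is an arbitrary illustration against the 10 Gbps link, not a recommendation, and the task ARN is a placeholder:

```python
import boto3

datasync = boto3.client("datasync", region_name="us-east-1")

task_arn = "arn:aws:datasync:us-east-1:123456789012:task/task-0123456789abcdef0"

# Cap the task at roughly 500 MB/s so the shared Direct Connect link
# is never saturated during the transfer window.
datasync.update_task(
    TaskArn=task_arn,
    Options={"BytesPerSecond": 500 * 1024 * 1024},
)
```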

Delete On-Premises Data:

After DataSync has copied and verified the data in S3 Glacier Deep Archive, delete the data from the on-premises NFS server. Note that DataSync does not delete source files itself, so this cleanup is a separate step on the NFS server. This frees storage capacity on premises while the data remains securely archived on AWS.
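One way to gate that cleanup, sketched with boto3 and a placeholder execution ARN, is to check the task execution status before touching the NFS data:

```python
import boto3

datasync = boto3.client("datasync", region_name="us-east-1")

# Placeholder ARN of a finished task execution.
execution_arn = (
    "arn:aws:datasync:us-east-1:123456789012:task/task-0123456789abcdef0"
    "/execution/exec-0123456789abcdef0"
)

# Only clear the local archive after DataSync reports a verified SUCCESS.
result = datasync.describe_task_execution(TaskExecutionArn=execution_arn)
if result["Status"] == "SUCCESS":
    print("Transfer verified; safe to delete the local batch files.")
    # Site-specific cleanup of the NFS export would run here.
else:
    print(f"Execution status is {result['Status']}; keeping local data.")
```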

This approach leverages AWS DataSync for efficient, secure, and automated data transfer, and S3 Glacier Deep Archive for cost-effective long-term storage. It is also the most cost-effective of the four options: it uses the Hyper-V compute capacity the company already has instead of paying for an EC2-hosted agent (options A and C), writes directly to the cheapest storage class rather than the pricier S3 Glacier Instant Retrieval (option A) or an S3 Standard staging step with a lifecycle transition (option C), and avoids the operational complexity of emulating a tape library with a Tape Gateway (option D).


asked 16/09/2024
Sérgio Filipe Soares