Amazon DVA-C01 Practice Test - Questions Answers, Page 22

AWS CodeBuild builds code for an application, creates the Docker image, pushes the image to Amazon Elastic Container Registry (Amazon ECR), and tags the image with a unique identifier. If the Developers already have the AWS CLI configured on their workstations, how can the Docker images be pulled to the workstations?

A. Run the following: docker pull REPOSITORY_URI:TAG
B. Run the output of the following: aws ecr get-login and then run: docker pull REPOSITORY_URI:TAG
C. Run the following: aws ecr get-login and then run: docker pull REPOSITORY_URI:TAG
D. Run the output of the following: aws ecr get-download-url-for-layer and then run: docker pull REPOSITORY_URI:TAG
Suggested answer: B

Explanation:

https://docs.aws.amazon.com/cli/latest/reference/ecr/get-login.html
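
A minimal sketch of answer B, assuming AWS CLI v1 (which still ships get-login; CLI v2 replaces it with get-login-password) and placeholder account, region, repository, and tag values.

# get-login prints a docker login command; running its output authenticates Docker to ECR.
$(aws ecr get-login --no-include-email --region us-east-1)
# Pull the tagged image; the repository URI and tag below are placeholders.
docker pull 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:build-42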

A company caches session information for a web application in an Amazon DynamoDB table. The company wants an automated way to delete old items from the table. What is the simplest way to do this?

A. Write a script that deletes old records; schedule the script as a cron job on an Amazon EC2 instance.
B. Add an attribute with the expiration time; enable the Time To Live feature based on that attribute.
C. Each day, create a new table to hold session data; delete the previous day’s table.
D. Add an attribute with the expiration time; name the attribute ItemExpiration.
Suggested answer: B

Explanation:

https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/time-to-live-ttl-howto.html
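
A minimal sketch of answer B using the AWS CLI; the table name and the ExpirationTime attribute are placeholders. TTL expects the attribute to hold an expiration timestamp in Unix epoch seconds.

# Enable TTL on the table, keyed to a placeholder ExpirationTime attribute.
aws dynamodb update-time-to-live \
    --table-name SessionTable \
    --time-to-live-specification "Enabled=true, AttributeName=ExpirationTime"

# Write a session item that expires one hour from now (epoch seconds).
aws dynamodb put-item \
    --table-name SessionTable \
    --item '{"SessionId": {"S": "abc123"}, "ExpirationTime": {"N": "'"$(($(date +%s) + 3600))"'"}}'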

An application is expected to process many files. Each file takes four minutes to process in a single AWS Lambda invocation. The Lambda function does not return any important data. What is the fastest way to process all the files?

A. First split the files to make them smaller, then process with synchronous RequestResponse Lambda invocations.
B. Make synchronous RequestResponse Lambda invocations and process the files one by one.
C. Make asynchronous Event Lambda invocations and process the files in parallel.
D. First join all the files, then process it all at once with an asynchronous Event Lambda invocation.
Suggested answer: C
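
A minimal sketch of answer C with the AWS CLI; the function name, payload shape, and file keys are placeholders. Because --invocation-type Event queues each invocation asynchronously, the loop does not wait four minutes per file.

# Fire one asynchronous (Event) invocation per file.
# (With AWS CLI v2, add --cli-binary-format raw-in-base64-out to pass raw JSON.)
for key in file-001 file-002 file-003; do
    aws lambda invoke \
        --function-name process-file \
        --invocation-type Event \
        --payload '{"key": "'"$key"'"}' \
        /dev/null
done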


The upload of a 15 GB object to Amazon S3 fails. The error message reads: “Your proposed upload exceeds the maximum allowed object size.” What technique will allow the Developer to upload this object?

A. Upload the object using the multi-part upload API.
B. Upload the object over an AWS Direct Connect connection.
C. Contact AWS Support to increase the object size limit.
D. Upload the object to another AWS region.
Suggested answer: A

Explanation:

https://docs.aws.amazon.com/AmazonS3/latest/dev/UploadingObjects.html
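
A minimal sketch of answer A; the bucket and file names are placeholders. A single PUT is capped at 5 GB, so a 15 GB object must go through the multipart upload API. The high-level aws s3 cp command switches to multipart uploads automatically for large files.

# High-level copy: the CLI performs a multipart upload automatically for large objects.
aws s3 cp ./backup-15gb.tar s3://example-bucket/backup-15gb.tar

# Equivalent low-level flow (abridged): create the upload, send the parts, then complete it.
aws s3api create-multipart-upload --bucket example-bucket --key backup-15gb.tar
# ... aws s3api upload-part for each part, then aws s3api complete-multipart-upload ...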

A company has an AWS CloudFormation template that is stored as a single file. The template is able to launch and create a full infrastructure stack. Which best practice would increase the maintainability of the template?

A. Use nested stacks for common template patterns.
B. Embed credentials to prevent typos.
C. Remove mappings to decrease the number of variables.
D. Use AWS::Include to reference publicly-hosted template files.
Suggested answer: A
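
A minimal sketch of one way nested stacks are typically deployed from the CLI, assuming a parent template (main.yaml) whose AWS::CloudFormation::Stack resources point at local child templates; all file, bucket, and stack names are placeholders.

# Upload the child templates referenced by the parent and rewrite their TemplateURL values.
aws cloudformation package \
    --template-file main.yaml \
    --s3-bucket example-artifact-bucket \
    --output-template-file packaged.yaml

# Deploy the packaged parent template, which creates the nested stacks.
aws cloudformation deploy \
    --template-file packaged.yaml \
    --stack-name example-app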

A Developer wants to encrypt new objects that are being uploaded to an Amazon S3 bucket by an application. There must be an audit trail of who has used the key during this process. There should be no change to the performance of the application.

Which type of encryption meets these requirements?

A. Server-side encryption using S3-managed keys
B. Server-side encryption with AWS KMS-managed keys
C. Client-side encryption with a client-side symmetric master key
D. Client-side encryption with AWS KMS-managed keys
Suggested answer: B
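
A minimal sketch of answer B using the CLI; the bucket, object, and key alias are placeholders. With a KMS key, each use of the key is recorded in AWS CloudTrail (the audit trail), and the encryption happens server-side, so the application is unchanged.

# Upload with server-side encryption under a KMS key (placeholder alias).
aws s3 cp report.csv s3://example-bucket/report.csv \
    --sse aws:kms \
    --sse-kms-key-id alias/example-app-key

# Alternatively, make SSE-KMS the bucket default so uploads need no changes at all.
aws s3api put-bucket-encryption \
    --bucket example-bucket \
    --server-side-encryption-configuration '{"Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms", "KMSMasterKeyID": "alias/example-app-key"}}]}'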

An on-premises application makes repeated calls to store files to Amazon S3. As usage of the application has increased, “LimitExceeded” errors are being logged. What should be changed to fix this error?

A. Implement exponential backoff in the application.
B. Load balance the application to multiple servers.
C. Move the application to Amazon EC2.
D. Add a one-second delay to each API call.
Suggested answer: A
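
A minimal retry sketch for answer A, assuming a shell wrapper around the upload; the bucket and file names are placeholders. Production code would normally rely on the AWS SDK's built-in retry and backoff behavior, usually with added jitter.

# Retry the upload with exponentially growing waits (2, 4, 8, 16 seconds).
max_attempts=5
attempt=0
until aws s3 cp ./session.log s3://example-bucket/session.log; do
    attempt=$((attempt + 1))
    if [ "$attempt" -ge "$max_attempts" ]; then
        echo "upload failed after $max_attempts attempts" >&2
        exit 1
    fi
    sleep $((2 ** attempt))
done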

An organization is storing large files in Amazon S3, and is writing a web application to display metadata about the files to end-users. Based on the metadata a user selects an object to download. The organization needs a mechanism to index the files and provide single-digit millisecond latency retrieval for the metadata.

What AWS service should be used to accomplish this?

A. Amazon DynamoDB
B. Amazon EC2
C. AWS Lambda
D. Amazon RDS
Suggested answer: A

Explanation:

Amazon DynamoDB is a fast and flexible NoSQL database service for all applications that need consistent, single-digit millisecond latency at any scale. It is a fully managed database and supports both document and key-value data models. Its flexible data model and reliable performance make it a great fit for mobile, web, gaming, ad-tech, Internet of Things (IoT), and many other applications.
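
A minimal sketch of how the metadata lookup might work with DynamoDB via the CLI; the table, key, and attribute names are placeholders.

# Store metadata about an S3 object (placeholder table and attributes).
aws dynamodb put-item \
    --table-name FileMetadata \
    --item '{"FileKey": {"S": "reports/2021/q1.pdf"}, "SizeBytes": {"N": "1048576"}, "Title": {"S": "Q1 Report"}}'

# Retrieve the metadata by key with single-digit millisecond latency.
aws dynamodb get-item \
    --table-name FileMetadata \
    --key '{"FileKey": {"S": "reports/2021/q1.pdf"}}'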

While developing an application that runs on Amazon EC2 in an Amazon VPC, a Developer identifies the need for centralized storage of application-level logs. Which AWS service can be used to securely store these logs?

A. Amazon EC2 VPC Flow Logs
B. Amazon CloudWatch Logs
C. Amazon CloudSearch
D. AWS CloudTrail
Suggested answer: B
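
A minimal sketch of pushing an application log event to CloudWatch Logs with the CLI; the log group, stream, and message are placeholders. In practice the CloudWatch agent or an SDK logging library would ship the logs from the EC2 instance.

# Create a log group and stream (placeholder names), then push one event (timestamp in milliseconds).
aws logs create-log-group --log-group-name /example-app/application
aws logs create-log-stream --log-group-name /example-app/application --log-stream-name i-0123456789abcdef0
aws logs put-log-events \
    --log-group-name /example-app/application \
    --log-stream-name i-0123456789abcdef0 \
    --log-events timestamp=$(date +%s000),message=application-started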

A stock market monitoring application uses Amazon Kinesis for data ingestion. During simulated tests of peak data rates, the Kinesis stream cannot keep up with the incoming data. What step will allow Kinesis to accommodate the traffic during peak hours?

A. Install the Kinesis Producer Library (KPL) for ingesting data into the stream.
B. Reduce the data retention period to allow for more data ingestion using DecreaseStreamRetentionPeriod.
C. Increase the shard count of the stream using UpdateShardCount.
D. Ingest multiple records into the stream in a single call using PutRecords.
Suggested answer: C

Explanation:

https://docs.aws.amazon.com/streams/latest/dev/developing-producers-with-kpl.html
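
A minimal sketch of answer C using the CLI; the stream name and target shard count are placeholders. More shards mean more write capacity, which is what lets the stream absorb the peak traffic.

# Raise the stream's shard count (placeholder values); UNIFORM_SCALING rebalances evenly.
aws kinesis update-shard-count \
    --stream-name market-data-stream \
    --target-shard-count 8 \
    --scaling-type UNIFORM_SCALING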
