Amazon DVA-C01 Practice Test - Questions Answers, Page 22


Question 211

AWS CodeBuild builds code for an application, creates the Docker image, pushes the image to Amazon Elastic Container Registry (Amazon ECR), and tags the image with a unique identifier. If the Developers already have the AWS CLI configured on their workstations, how can the Docker images be pulled to the workstations?

A. Run the following: docker pull REPOSITORY_URI:TAG
B. Run the output of the following: aws ecr get-login, and then run: docker pull REPOSITORY_URI:TAG
C. Run the following: aws ecr get-login, and then run: docker pull REPOSITORY_URI:TAG
D. Run the output of the following: aws ecr get-download-url-for-layer, and then run: docker pull REPOSITORY_URI:TAG
Suggested answer: B

Explanation:

https://docs.aws.amazon.com/cli/latest/reference/ecr/get-login.html
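
As a sketch of why B works: with AWS CLI v1, aws ecr get-login prints a complete docker login command, so it is the command's output that must be executed before the pull (option C runs get-login itself but never the login it prints). The account ID, Region, repository, and tag below are placeholders:

    # CLI v1: execute the docker login command that get-login prints
    $(aws ecr get-login --no-include-email --region us-east-1)
    docker pull 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:build-42

    # AWS CLI v2 removed get-login; the equivalent flow uses get-login-password
    aws ecr get-login-password --region us-east-1 | \
        docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com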

Question 212

A company caches session information for a web application in an Amazon DynamoDB table. The company wants an automated way to delete old items from the table. What is the simplest way to do this?

A. Write a script that deletes old records; schedule the script as a cron job on an Amazon EC2 instance.
B. Add an attribute with the expiration time; enable the Time To Live feature based on that attribute.
C. Each day, create a new table to hold session data; delete the previous day's table.
D. Add an attribute with the expiration time; name the attribute ItemExpiration.
Suggested answer: B

Explanation:

https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/time-to-live-ttl-howto.html
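
A minimal sketch of enabling TTL from the CLI, assuming a hypothetical SessionData table whose expirationTime attribute holds an epoch timestamp in seconds; DynamoDB then deletes expired items automatically, with no scripts or extra infrastructure:

    # Enable TTL on the table, keyed to the expirationTime attribute
    aws dynamodb update-time-to-live \
        --table-name SessionData \
        --time-to-live-specification "Enabled=true, AttributeName=expirationTime"

    # Write a session item that expires in roughly one hour
    aws dynamodb put-item --table-name SessionData \
        --item "{\"sessionId\": {\"S\": \"abc123\"}, \"expirationTime\": {\"N\": \"$(( $(date +%s) + 3600 ))\"}}"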

Question 213

An application is expected to process many files. Each file takes four minutes to process in a single AWS Lambda invocation. The Lambda function does not return any important data. What is the fastest way to process all the files?

A. First split the files to make them smaller, then process with synchronous RequestResponse Lambda invocations.
B. Make synchronous RequestResponse Lambda invocations and process the files one by one.
C. Make asynchronous Event Lambda invocations and process the files in parallel.
D. First join all the files, then process it all at once with an asynchronous Event Lambda invocation.
Suggested answer: C
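
A hedged sketch of option C, assuming a hypothetical processFile function: --invocation-type Event makes each call return immediately (HTTP 202), so every file is queued at once and the four-minute processing runs happen in parallel instead of back to back:

    # Fire-and-forget invocations; the loop itself finishes in seconds
    # (with AWS CLI v2, also pass --cli-binary-format raw-in-base64-out)
    for key in file1.csv file2.csv file3.csv; do
        aws lambda invoke \
            --function-name processFile \
            --invocation-type Event \
            --payload "{\"key\": \"$key\"}" \
            /dev/null
    done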

Question 214

The upload of a 15 GB object to Amazon S3 fails. The error message reads: “Your proposed upload exceeds the maximum allowed object size.” What technique will allow the Developer to upload this object?

A. Upload the object using the multipart upload API.
B. Upload the object over an AWS Direct Connect connection.
C. Contact AWS Support to increase the object size limit.
D. Upload the object to another AWS Region.
Suggested answer: A

Explanation:

https://docs.aws.amazon.com/AmazonS3/latest/dev/UploadingObjects.html
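
The error arises because a single PUT is capped at 5 GB, while multipart upload supports objects up to 5 TB. A minimal sketch with a hypothetical bucket name; the high-level aws s3 commands switch to multipart automatically for large files:

    # High-level copy: multipart is used automatically above the size threshold
    aws s3 cp ./backup.tar s3://my-bucket/backup.tar

    # The low-level flow it wraps: initiate, upload parts, then complete
    aws s3api create-multipart-upload --bucket my-bucket --key backup.tar
    # ...upload-part for each chunk, then complete-multipart-upload with the part list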

Question 215

A company has an AWS CloudFormation template that is stored as a single file. The template is able to launch and create a full infrastructure stack. Which best practice would increase the maintainability of the template?

A. Use nested stacks for common template patterns.
B. Embed credentials to prevent typos.
C. Remove mappings to decrease the number of variables.
D. Use AWS::Include to reference publicly-hosted template files.
Suggested answer: A
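
A minimal sketch of a nested stack, with hypothetical resource names and template URL: the parent template pulls in a reusable child template through an AWS::CloudFormation::Stack resource, so common patterns live in one maintained file instead of being copied into every template:

    Resources:
      NetworkStack:
        Type: AWS::CloudFormation::Stack
        Properties:
          TemplateURL: https://s3.amazonaws.com/example-templates/network.yaml
          Parameters:
            VpcCidr: 10.0.0.0/16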

Question 216

A Developer wants to encrypt new objects that are being uploaded to an Amazon S3 bucket by an application. There must be an audit trail of who has used the key during this process. There should be no change to the performance of the application.

Which type of encryption meets these requirements?

A. Server-side encryption using S3-managed keys
B. Server-side encryption with AWS KMS-managed keys
C. Client-side encryption with a client-side symmetric master key
D. Client-side encryption with AWS KMS-managed keys
Suggested answer: B
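
SSE-KMS fits because every use of the KMS key is recorded in AWS CloudTrail (the audit trail), while encryption stays server-side so the application itself is unchanged. A sketch with a hypothetical bucket and key ID:

    # Upload with SSE-KMS; each use of the key shows up in CloudTrail
    aws s3 cp report.pdf s3://my-bucket/report.pdf \
        --sse aws:kms \
        --sse-kms-key-id 1234abcd-12ab-34cd-56ef-1234567890ab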

Question 217

An on-premises application makes repeated calls to store files to Amazon S3. As usage of the application has increased, “LimitExceeded” errors are being logged. What should be changed to fix this error?

A. Implement exponential backoff in the application.
B. Load balance the application to multiple servers.
C. Move the application to Amazon EC2.
D. Add a one-second delay to each API call.
Suggested answer: A
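
A minimal bash sketch of exponential backoff with jitter, assuming a hypothetical upload command; the AWS SDKs implement the same retry behavior automatically, which is why backoff (rather than a fixed delay) is the standard fix for throttling:

    # Retry up to five times, waiting roughly 1, 2, 4, 8, 16 seconds plus jitter
    for attempt in 0 1 2 3 4; do
        aws s3 cp ./data.csv s3://my-bucket/data.csv && break
        sleep $(( (2 ** attempt) + RANDOM % 3 ))
    done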

Question 218

An organization is storing large files in Amazon S3 and is writing a web application to display metadata about the files to end users. Based on the metadata, a user selects an object to download. The organization needs a mechanism to index the files and provide single-digit millisecond latency retrieval for the metadata.

What AWS service should be used to accomplish this?

A. Amazon DynamoDB
B. Amazon EC2
C. AWS Lambda
D. Amazon RDS
Suggested answer: A

Explanation:

Amazon DynamoDB is a fast and flexible NoSQL database service for applications that need consistent, single-digit millisecond latency at any scale. It is a fully managed database that supports both document and key-value data models. Its flexible data model and reliable performance make it a great fit for mobile, web, gaming, ad-tech, Internet of Things (IoT), and many other applications.
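
A sketch of the pattern with a hypothetical FileMetadata table keyed by file ID: the table indexes each S3 object's metadata, and GetItem lookups by key return in single-digit milliseconds:

    # Index an S3 object's metadata
    aws dynamodb put-item --table-name FileMetadata \
        --item '{"fileId": {"S": "report-2024"}, "s3Key": {"S": "files/report-2024.pdf"}, "sizeBytes": {"N": "15728640"}}'

    # Millisecond-latency lookup by key
    aws dynamodb get-item --table-name FileMetadata --key '{"fileId": {"S": "report-2024"}}'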

Question 219

While developing an application that runs on Amazon EC2 in an Amazon VPC, a Developer identifies the need for centralized storage of application-level logs. Which AWS service can be used to securely store these logs?

A. Amazon EC2 VPC Flow Logs
B. Amazon CloudWatch Logs
C. Amazon CloudSearch
D. AWS CloudTrail
Suggested answer: B
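
A minimal sketch with hypothetical log group and stream names; in practice the CloudWatch agent usually ships application log lines, but the underlying API calls look like this:

    aws logs create-log-group --log-group-name /myapp/application
    aws logs create-log-stream --log-group-name /myapp/application --log-stream-name i-0abc123
    # Timestamps are epoch milliseconds
    aws logs put-log-events \
        --log-group-name /myapp/application \
        --log-stream-name i-0abc123 \
        --log-events timestamp=$(($(date +%s) * 1000)),message="user login ok"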

Question 220

A stock market monitoring application uses Amazon Kinesis for data ingestion. During simulated tests of peak data rates, the Kinesis stream cannot keep up with the incoming data. What step will allow Kinesis to accommodate the traffic during peak hours?

A. Install the Kinesis Producer Library (KPL) for ingesting data into the stream.
B. Reduce the data retention period to allow for more data ingestion using DecreaseStreamRetentionPeriod.
C. Increase the shard count of the stream using UpdateShardCount.
D. Ingest multiple records into the stream in a single call using PutRecords.
Suggested answer: C

Explanation:

https://docs.aws.amazon.com/streams/latest/dev/developing-producers-with-kpl.html
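
A sketch of option C with a hypothetical stream name: each shard accepts up to 1 MB/s or 1,000 records/s, so raising the shard count is what raises total ingest capacity (KPL and PutRecords only use existing capacity more efficiently):

    # Double capacity by scaling from 2 to 4 shards
    aws kinesis update-shard-count \
        --stream-name market-data \
        --target-shard-count 4 \
        --scaling-type UNIFORM_SCALING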
