Amazon DVA-C01 Practice Test - Questions Answers, Page 40

A company is launching a polling application. The application will store the results of each poll in an Amazon DynamoDB table. Management wants to remove poll data after a few days and store an archive of those records in Amazon S3. Which approach would allow the application to archive each poll's data while keeping complexity to a MINIMUM?

A. Enable Time to Live (TTL) on the DynamoDB table. Enable DynamoDB Streams on the table, and store the records removed from the stream in Amazon S3.
B. Schedule an AWS Lambda function to periodically scan the DynamoDB table. Use the BatchWriteItem operation to delete the results of a scan. Enable DynamoDB Streams on the table, and store the records removed from the stream in Amazon S3.
C. Enable DynamoDB Streams on the table. Configure the stream as a trigger for AWS Lambda. Save records to Amazon S3 when records on the stream are modified.
D. Enable cross-Region replication on the S3 bucket to archive the poll data.
Suggested answer: C
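
For reference, here is a minimal Python (boto3) sketch of a Lambda handler subscribed to the table's stream that archives removed items to S3. The bucket name and the pollId attribute are assumptions for illustration; TTL expirations also surface on the stream as REMOVE events, so the same handler covers the TTL-based approach.

```python
import json
import boto3

s3 = boto3.client("s3")
ARCHIVE_BUCKET = "poll-archive-bucket"  # hypothetical bucket name

def handler(event, context):
    """Triggered by a DynamoDB stream; archives deleted poll items to S3."""
    for record in event.get("Records", []):
        # TTL expirations and explicit deletes both arrive as REMOVE events.
        if record["eventName"] != "REMOVE":
            continue
        old_image = record["dynamodb"].get("OldImage", {})
        poll_id = old_image.get("pollId", {}).get("S", "unknown")  # assumed key name
        s3.put_object(
            Bucket=ARCHIVE_BUCKET,
            Key=f"archive/{poll_id}/{record['eventID']}.json",
            Body=json.dumps(old_image),
        )
```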

A developer is designing a distributed application built using a microservices architecture spanning multiple AWS accounts. The company's operations team wants to analyze and debug application issues from a centralized account. How can the developer meet these requirements?

A. Use an AWS X-Ray agent with role assumption to publish data into the centralized account.
B. Use AWS X-Ray and create a new IAM user to publish the access keys into the centralized account.
C. Use VPC Flow Logs to collect application logs across different accounts.
D. Enable AWS CloudTrail to publish the trails to an Amazon S3 bucket in the centralized account.
Suggested answer: A
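
In practice the X-Ray SDK and daemon handle segment delivery, but as a rough sketch of the role-assumption idea in option A, the following Python (boto3) snippet assumes a hypothetical role in the centralized account and publishes a hand-built segment there. The role ARN and segment fields are illustrative only.

```python
import json
import secrets
import time

import boto3

# Hypothetical role in the centralized monitoring account.
CENTRAL_ROLE_ARN = "arn:aws:iam::111122223333:role/central-xray-publisher"

def central_xray_client():
    """Assume the cross-account role and return an X-Ray client that uses it."""
    creds = boto3.client("sts").assume_role(
        RoleArn=CENTRAL_ROLE_ARN, RoleSessionName="xray-publish"
    )["Credentials"]
    return boto3.client(
        "xray",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )

def publish_segment(name):
    """Send a single hand-built trace segment to the centralized account."""
    now = time.time()
    segment = {
        "name": name,
        "id": secrets.token_hex(8),                             # 16-hex-digit segment ID
        "trace_id": f"1-{int(now):x}-{secrets.token_hex(12)}",  # X-Ray trace ID format
        "start_time": now - 0.05,
        "end_time": now,
    }
    central_xray_client().put_trace_segments(
        TraceSegmentDocuments=[json.dumps(segment)]
    )
```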

What is required to trace Lambda-based applications with AWS X-Ray?

A. Send logs from the Lambda application to an S3 bucket; trigger a Lambda function from that bucket to send data to AWS X-Ray.
B. Trigger a Lambda function from the application logs in Amazon CloudWatch to submit tracing data to AWS X-Ray.
C. Use an IAM execution role to give the Lambda function permissions, and enable tracing.
D. Update and add AWS X-Ray daemon code to relevant parts of the Lambda function to set up the trace.
Suggested answer: D
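
As a hedged illustration of instrumenting a Lambda function for X-Ray: the sketch below assumes the aws-xray-sdk package is bundled with the deployment, active tracing is enabled on the function, and the execution role carries X-Ray write permissions. The table and attribute names are made up.

```python
# Assumes the aws-xray-sdk package is bundled with the deployment, active
# tracing is enabled on the function, and the execution role allows
# xray:PutTraceSegments / xray:PutTelemetryRecords.
import boto3
from aws_xray_sdk.core import patch_all, xray_recorder

patch_all()  # instrument boto3 so downstream AWS calls appear as subsegments

dynamodb = boto3.resource("dynamodb")
TABLE_NAME = "votes"  # hypothetical table name

def handler(event, context):
    with xray_recorder.in_subsegment("record-vote"):  # custom subsegment
        dynamodb.Table(TABLE_NAME).put_item(
            Item={"pollId": event["pollId"], "choice": event["choice"]}
        )
    return {"status": "ok"}
```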

A developer is creating an application to process a large number of requests. Requests must be processed in order, and each request should be processed only once. How should Amazon SQS be deployed to achieve this?

A. Configure First-In-First-Out (FIFO) delivery in a standard Amazon SQS queue to process requests.
B. Use an SQS FIFO queue to process requests.
C. Use the SetOrder attribute to ensure sequential request processing.
D. Convert the standard queue to a FIFO queue by renaming the queue to use the .fifo suffix.
Suggested answer: B
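
A minimal boto3 sketch of the FIFO-queue approach in option B; the queue and message-group names are illustrative.

```python
import boto3

sqs = boto3.client("sqs")

# FIFO queue names must end in ".fifo"; content-based deduplication avoids
# having to supply an explicit MessageDeduplicationId on every send.
queue_url = sqs.create_queue(
    QueueName="requests.fifo",
    Attributes={"FifoQueue": "true", "ContentBasedDeduplication": "true"},
)["QueueUrl"]

# Messages sharing a MessageGroupId are delivered in order and deduplicated
# within the deduplication interval.
sqs.send_message(
    QueueUrl=queue_url,
    MessageBody='{"requestId": "1001"}',
    MessageGroupId="orders",  # hypothetical group name
)
```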


A developer must modify an Alexa skill backed by an AWS Lambda function to access an Amazon DynamoDB table in a second account. A role in the second account has been created with permissions to access the table. How should the table be accessed?

A. Modify the Lambda function execution role's permissions to include the new role.
B. Change the Lambda function execution role to be the new role.
C. Assume the new role in the Lambda function when accessing the table.
D. Store the access key and the secret key for the new role and use them when accessing the table.
Suggested answer: A
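
For reference, here is a sketch of the cross-account mechanics touched on in options A and C: the function's execution role would be allowed to assume the second account's role, and the code then calls STS and uses the temporary credentials. The role ARN, table name, and key are hypothetical.

```python
import boto3

# Hypothetical role and table in the second account.
CROSS_ACCOUNT_ROLE_ARN = "arn:aws:iam::222233334444:role/dynamodb-table-access"
TABLE_NAME = "skill-data"

def get_cross_account_table():
    """Assume the second account's role and return a DynamoDB Table resource."""
    creds = boto3.client("sts").assume_role(
        RoleArn=CROSS_ACCOUNT_ROLE_ARN, RoleSessionName="alexa-skill"
    )["Credentials"]
    dynamodb = boto3.resource(
        "dynamodb",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )
    return dynamodb.Table(TABLE_NAME)

def handler(event, context):
    item = get_cross_account_table().get_item(Key={"userId": event["userId"]})
    return item.get("Item", {})
```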

A developer is designing an AWS Lambda function that creates temporary files that are less than 10 MB in size during execution. The temporary files will be accessed and modified multiple times during execution. The developer has no need to save or retrieve these files in the future.

Where should the temporary files be stored?

A. The /tmp directory
B. Amazon EFS
C. Amazon EBS
D. Amazon S3
Suggested answer: A
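
A short sketch of working with scratch files in /tmp from a Lambda handler; the file contents are placeholders.

```python
import os
import tempfile

def handler(event, context):
    # /tmp is the function's writable scratch space (512 MB by default);
    # its contents are discarded when the execution environment is recycled.
    fd, path = tempfile.mkstemp(dir="/tmp", suffix=".json")
    with os.fdopen(fd, "w") as f:
        f.write('{"step": 1}')

    # The file can be re-read and modified repeatedly within the invocation.
    with open(path, "a") as f:
        f.write('\n{"step": 2}')

    with open(path) as f:
        contents = f.read()

    os.remove(path)  # tidy up so repeated invocations don't fill /tmp
    return {"bytes_read": len(contents)}
```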

A video-hosting website has two types of members: those who pay a fee and those who do not. Each video upload places a message in Amazon SQS. A fleet of Amazon EC2 instances polls Amazon SQS and processes each video. The developer needs to ensure that the videos uploaded by the paying members are processed first. How can the developer meet this requirement?

A. Create two SQS queues: one for paying members and one for non-paying members. Poll the paying member queue first, and then poll the non-paying member queue.
B. Use SQS to set priorities on individual items within a single queue; give the paying members' videos the highest priority.
C. Use SQS to set priorities on individual items within a single queue, and use Amazon SNS to encode the videos.
D. Create two Amazon SNS topics: one for paying members and one for non-paying members. Use SNS topic subscription priorities to differentiate between the two types of members.
Suggested answer: B
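
As a sketch of the two-queue pattern described in option A (the queue URLs are placeholders), a worker could poll the paying-member queue first and fall back to the other queue only when it is empty:

```python
import boto3

sqs = boto3.client("sqs")

# Hypothetical queue URLs for the two member tiers.
PAID_QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/paid-uploads"
FREE_QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/free-uploads"

def next_video_message():
    """Drain the paying-member queue first; fall back to the free queue."""
    for queue_url in (PAID_QUEUE_URL, FREE_QUEUE_URL):
        resp = sqs.receive_message(
            QueueUrl=queue_url, MaxNumberOfMessages=1, WaitTimeSeconds=1
        )
        messages = resp.get("Messages", [])
        if messages:
            return queue_url, messages[0]
    return None, None
```

After processing the returned message, the worker would delete it from the queue it was received from using delete_message.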

A company has a web application in an Amazon Elastic Container Service (Amazon ECS) cluster running hundreds of secure services in AWS Fargate containers. The services are in target groups routed by an Application Load Balancer (ALB). Application users log in to the website anonymously, but they must be authenticated using any OpenID Connect protocol-compatible identity provider (IdP) to access the secure services. Which authentication approach would meet these requirements with the LEAST amount of effort?

A. Configure the services to use Amazon Cognito.
B. Configure the ALB to use Amazon Cognito.
C. Configure the services to use AWS Security Token Service (AWS STS) with the OpenID Connect IdP.
D. Configure the Amazon ECS cluster to use AWS Security Token Service (AWS STS) with the OpenID Connect IdP.
Suggested answer: A
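
Amazon Cognito can be placed in front of the secure services either within each service or at the load balancer; as a hedged sketch of the ALB-level wiring raised in option B, the following boto3 call attaches an authenticate-cognito action ahead of the forward action on a listener rule. All ARNs, the client ID, the domain, and the path pattern are placeholders.

```python
import boto3

elbv2 = boto3.client("elbv2")

elbv2.create_rule(
    ListenerArn="arn:aws:elasticloadbalancing:us-east-1:123456789012:listener/app/web/abc/def",
    Priority=10,
    Conditions=[{"Field": "path-pattern", "Values": ["/secure/*"]}],
    Actions=[
        {   # Step 1: authenticate the user against the Cognito user pool
            "Type": "authenticate-cognito",
            "Order": 1,
            "AuthenticateCognitoConfig": {
                "UserPoolArn": "arn:aws:cognito-idp:us-east-1:123456789012:userpool/us-east-1_example",
                "UserPoolClientId": "exampleclientid",
                "UserPoolDomain": "example-auth-domain",
            },
        },
        {   # Step 2: forward authenticated requests to the Fargate services
            "Type": "forward",
            "Order": 2,
            "TargetGroupArn": "arn:aws:elasticloadbalancing:us-east-1:123456789012:targetgroup/secure-svcs/123abc",
        },
    ],
)
```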

A developer from AnyCompany's AWS account needs access to the Example Corp AWS account. AnyCompany uses an identity provider that is compatible with OpenID Connect. What is the MOST secure way for Example Corp to allow developer access?

A. Create a cross-account role and call the AssumeRole API operation.
B. Create a user in the Example Corp account and provide the access keys.
C. Create a user in the Example Corp account and provide the credentials.
D. Create a cross-account role and call the AssumeRoleWithWebIdentity API operation.
Suggested answer: B
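
For reference, option D's AssumeRoleWithWebIdentity call looks roughly like the following in Python (boto3). The role ARN and the OIDC token are placeholders; the token would normally be obtained from AnyCompany's identity provider after the developer signs in.

```python
import boto3

sts = boto3.client("sts")

resp = sts.assume_role_with_web_identity(
    RoleArn="arn:aws:iam::999988887777:role/examplecorp-developer-access",
    RoleSessionName="anycompany-developer",
    WebIdentityToken="eyJraWQiOi...",  # truncated placeholder OIDC token
)

creds = resp["Credentials"]  # temporary credentials scoped to the role
s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
```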

A company is developing a new web application in Python. A developer must deploy the application using AWS Elastic Beanstalk from the AWS Management Console. The developer creates an Elastic Beanstalk source bundle to upload using the console. Which of the following are requirements when creating the source bundle? (Select TWO.)

A. The source bundle must include the ebextensions.yaml file.
B. The source bundle must not include a top-level directory.
C. The source bundle must be compressed with any required dependencies in a top-level parent folder.
D. The source bundle must be created as a single .zip or .war file.
E. The source bundle must be uploaded into Amazon EFS.
Suggested answer: B, D
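
A minimal Python sketch of building a source bundle that satisfies requirements B and D: a single .zip whose entries sit at the archive root rather than under a top-level parent folder. The project directory and bundle names are hypothetical.

```python
import zipfile
from pathlib import Path

PROJECT_DIR = Path("my-python-app")        # hypothetical project directory
BUNDLE = Path("app-source-bundle.zip")     # single .zip file to upload

# Archive entries are written relative to the project directory, so the
# bundle's root contains application.py, requirements.txt, etc. directly,
# with no top-level parent folder wrapping them.
with zipfile.ZipFile(BUNDLE, "w", zipfile.ZIP_DEFLATED) as bundle:
    for path in PROJECT_DIR.rglob("*"):
        if path.is_file():
            bundle.write(path, arcname=path.relative_to(PROJECT_DIR))
```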