
Amazon DVA-C01 Practice Test - Questions Answers, Page 40


Question 391


A company is launching a polling application. The application will store the results of each poll in an Amazon DynamoDB table. Management wants to remove poll data after a few days and store an archive of those records in Amazon S3. Which approach would allow the application to archive each poll's data while keeping complexity to a MINIMUM?

A. Enable Time to Live (TTL) on the DynamoDB table. Enable DynamoDB Streams on the table and store the records removed from the stream in Amazon S3.
B. Schedule an AWS Lambda function to periodically scan the DynamoDB table. Use the BatchWriteItem operation to delete the results of the scan. Enable DynamoDB Streams on the table and store the records removed from the stream in Amazon S3.
C. Enable DynamoDB Streams on the table. Configure the stream as a trigger for AWS Lambda. Save records to Amazon S3 when records on the stream are modified.
D. Enable cross-Region replication on the S3 bucket to archive the poll data.
Suggested answer: C
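For reference, a minimal Python (boto3) sketch of the stream-to-archive pattern behind options A and C: a Lambda function triggered by DynamoDB Streams that copies REMOVE records (for example, items expired by TTL) to Amazon S3. The bucket name and key scheme are hypothetical, and the stream is assumed to capture OLD_IMAGE.

```python
import json
import boto3

s3 = boto3.client("s3")
ARCHIVE_BUCKET = "poll-archive-bucket"  # hypothetical bucket name

def handler(event, context):
    # DynamoDB Streams delivers batches of records; REMOVE events
    # carry the deleted item when the stream captures OLD_IMAGE.
    for record in event["Records"]:
        if record["eventName"] != "REMOVE":
            continue
        old_image = record["dynamodb"].get("OldImage", {})
        key = f"polls/{record['eventID']}.json"  # hypothetical key scheme
        s3.put_object(
            Bucket=ARCHIVE_BUCKET,
            Key=key,
            Body=json.dumps(old_image),
        )
```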

Question 392


A developer is designing a distributed application built using a microservices architecture spanning multiple AWS accounts. The company's operations team wants to analyze and debug application issues from a centralized account. How can the developer meet these requirements?

A. Use the AWS X-Ray agent with role assumption to publish data into the centralized account.
B. Use AWS X-Ray and create a new IAM user to publish the access keys into the centralized account.
C. Use VPC Flow Logs to collect application logs across different accounts.
D. Enable AWS CloudTrail to publish the trails in an Amazon S3 bucket in the centralized account.
Suggested answer: A
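A minimal boto3 sketch of the role-assumption idea in option A: assume a role in the centralized account and publish trace segments there. In practice the X-Ray daemon/agent handles this for you; the role ARN, service name, and segment fields below are hypothetical.

```python
import json
import time
import boto3

# Hypothetical role in the centralized monitoring account
CENTRAL_ROLE_ARN = "arn:aws:iam::111122223333:role/central-xray-publisher"

def central_xray_client():
    # Assume the centralized account's role, then build an X-Ray
    # client from the temporary credentials it returns.
    creds = boto3.client("sts").assume_role(
        RoleArn=CENTRAL_ROLE_ARN,
        RoleSessionName="xray-publish",
    )["Credentials"]
    return boto3.client(
        "xray",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )

segment = {
    "name": "orders-service",                # hypothetical service name
    "id": "70de5b6f19ff9a0a",                # 16-hex-digit segment id
    "trace_id": "1-63e2d2a4-12456789abcdef012345678a",
    "start_time": time.time() - 0.05,
    "end_time": time.time(),
}
central_xray_client().put_trace_segments(
    TraceSegmentDocuments=[json.dumps(segment)]
)
```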

Question 393


What is required to trace Lambda-based applications with AWS X-Ray?

A. Send logs from the Lambda application to an S3 bucket; trigger a Lambda function from that bucket to send data to AWS X-Ray.
B. Trigger a Lambda function from the application logs in Amazon CloudWatch to submit tracing data to AWS X-Ray.
C. Use an IAM execution role to give the Lambda function permissions and enable tracing.
D. Update and add AWS X-Ray daemon code to the relevant parts of the Lambda function to set up the trace.
Suggested answer: D
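As background, instrumenting a Python Lambda function typically uses the AWS X-Ray SDK, with active tracing enabled on the function and X-Ray write permissions on its execution role (Lambda runs the daemon for you). A minimal sketch; the subsegment name and DynamoDB table are hypothetical.

```python
import boto3
from aws_xray_sdk.core import xray_recorder, patch_all

patch_all()  # instrument boto3 so AWS calls appear as subsegments

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("polls")  # hypothetical table name

def handler(event, context):
    # Custom subsegment around a unit of application logic
    with xray_recorder.in_subsegment("load-poll"):
        item = table.get_item(Key={"pollId": event["pollId"]})
    return item.get("Item")
```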

Question 394


A developer is creating an application to process a large number of requests. Requests must be processed in order, and each request should be processed only once. How should Amazon SQS be deployed to achieve this?

A. Configure First-In-First-Out (FIFO) delivery in a standard Amazon SQS queue to process requests.
B. Use an SQS FIFO queue to process requests.
C. Use the SetOrder attribute to ensure sequential request processing.
D. Convert the standard queue to a FIFO queue by renaming the queue to use the .fifo suffix.
Suggested answer: B
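A minimal boto3 sketch of the FIFO queue pattern: ordering is preserved per message group, and deduplication IDs give exactly-once processing within the deduplication window. The queue and group names are hypothetical.

```python
import boto3

sqs = boto3.client("sqs")

# FIFO queue names must end in ".fifo"; the name is hypothetical
queue_url = sqs.create_queue(
    QueueName="requests.fifo",
    Attributes={"FifoQueue": "true"},
)["QueueUrl"]

sqs.send_message(
    QueueUrl=queue_url,
    MessageBody="request-payload",
    MessageGroupId="orders",            # ordering is preserved per group
    MessageDeduplicationId="req-0001",  # exactly-once within the dedup window
)
```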

Question 395



A developer must modify an Alexa skill backed by an AWS Lambda function to access an Amazon DynamoDB table in a second account. A role in the second account has been created with permissions to access the table. How should the table be accessed?

A. Modify the Lambda function execution role's permissions to include the new role.
B. Change the Lambda function execution role to be the new role.
C. Assume the new role in the Lambda function when accessing the table.
D. Store the access key and the secret key for the new role and use them when accessing the table.
Suggested answer: A
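A minimal boto3 sketch of the cross-account pattern that options A and C describe: the execution role is permitted to assume the second account's role, and the function assumes it before calling DynamoDB. The ARN, table name, and key are hypothetical.

```python
import boto3

# Hypothetical role and table in the second account
TABLE_ROLE_ARN = "arn:aws:iam::222233334444:role/dynamodb-table-access"

# Temporary credentials expire (one hour by default)
creds = boto3.client("sts").assume_role(
    RoleArn=TABLE_ROLE_ARN,
    RoleSessionName="alexa-skill-table-access",
)["Credentials"]

dynamodb = boto3.resource(
    "dynamodb",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
item = dynamodb.Table("skill-data").get_item(Key={"userId": "abc123"})
```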

Question 396


A developer is designing an AWS Lambda function that creates temporary files that are less than 10 MB during execution. The temporary files will be accessed and modified multiple times during execution. The developer has no need to save or retrieve these files in the future.

Where should the temporary file be stored?

A. The /tmp directory
B. Amazon EFS
C. Amazon EBS
D. Amazon S3
Suggested answer: A
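A minimal sketch of scratch-file handling in /tmp, the ephemeral local storage of the Lambda execution environment (512 MB by default). The file name and payload field are hypothetical.

```python
import os

def handler(event, context):
    # /tmp is local, writable, and scoped to the execution environment,
    # so it suits scratch files under 10 MB that need no persistence.
    path = os.path.join("/tmp", "scratch.dat")  # hypothetical file name
    with open(path, "wb") as f:
        f.write(event.get("payload", "").encode())
    # The file can be read and modified repeatedly during this invocation
    return {"bytes_written": os.path.getsize(path)}
```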

Question 397


A video-hosting website has two types of members: those who pay a fee and those who do not. Each video upload places a message in Amazon SQS. A fleet of Amazon EC2 instances polls Amazon SQS and processes each video. The developer needs to ensure that the videos uploaded by the paying members are processed first. How can the developer meet this requirement?

A. Create two SQS queues: one for paying members and one for non-paying members. Poll the paying-member queue first, and then poll the non-paying-member queue.
B. Use SQS to set priorities on individual items within a single queue; give the paying members' videos the highest priority.
C. Use SQS to set priorities on individual items within a single queue, and use Amazon SNS to encode the videos.
D. Create two Amazon SNS topics: one for paying members and one for non-paying members. Use SNS topic subscription priorities to differentiate between the two types of members.
Suggested answer: B
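For reference, the two-queue polling pattern described in option A could be sketched as follows: the consumer drains the paying-member queue before touching the non-paying one. The queue URLs are hypothetical.

```python
import boto3

sqs = boto3.client("sqs")

# Hypothetical queue URLs
PAID_QUEUE = "https://sqs.us-east-1.amazonaws.com/111122223333/uploads-paid"
FREE_QUEUE = "https://sqs.us-east-1.amazonaws.com/111122223333/uploads-free"

def next_message():
    # Check the paying-member queue first; fall back to the free queue
    # only when no paid work is waiting.
    for queue_url in (PAID_QUEUE, FREE_QUEUE):
        resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1)
        for msg in resp.get("Messages", []):
            return queue_url, msg
    return None, None
```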

Question 398


A company has a web application in an Amazon Elastic Container Service (Amazon ECS) cluster running hundreds of secure services in AWS Fargate containers. The services are in target groups routed by an Application Load Balancer (ALB). Application users log in to the website anonymously, but they must be authenticated using any OpenID Connect protocol-compatible identity provider (IdP) to access the secure services. Which authentication approach would meet these requirements with the LEAST amount of effort?

A. Configure the services to use Amazon Cognito.
B. Configure the ALB to use Amazon Cognito.
C. Configure the services to use AWS Security Token Service (AWS STS) with the OpenID Connect IdP.
D. Configure the Amazon ECS cluster to use AWS Security Token Service (AWS STS) with the OpenID Connect IdP.
Suggested answer: A
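As background on how ALB-level Cognito authentication (option B) is wired, a hedged boto3 sketch adds an authenticate-cognito action ahead of the forward action on a listener rule. All ARNs, the client ID, and the domain are hypothetical.

```python
import boto3

elbv2 = boto3.client("elbv2")

# Hypothetical listener and target group ARNs
LISTENER_ARN = "arn:aws:elasticloadbalancing:us-east-1:111122223333:listener/app/web/abc/def"
TARGET_GROUP_ARN = "arn:aws:elasticloadbalancing:us-east-1:111122223333:targetgroup/secure/123"

elbv2.create_rule(
    ListenerArn=LISTENER_ARN,
    Priority=10,
    Conditions=[{"Field": "path-pattern", "Values": ["/secure/*"]}],
    Actions=[
        {   # Authenticate against the Cognito user pool first...
            "Type": "authenticate-cognito",
            "Order": 1,
            "AuthenticateCognitoConfig": {
                "UserPoolArn": "arn:aws:cognito-idp:us-east-1:111122223333:userpool/us-east-1_EXAMPLE",
                "UserPoolClientId": "example-client-id",
                "UserPoolDomain": "example-domain",
            },
        },
        {   # ...then forward authenticated requests to the service
            "Type": "forward",
            "Order": 2,
            "TargetGroupArn": TARGET_GROUP_ARN,
        },
    ],
)
```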

Question 399


A developer from AnyCompany's AWS account needs access to the Example Corp AWS account. AnyCompany uses an identity provider that is compatible with OpenID Connect. What is the MOST secure way for Example Corp to allow developer access?

A. Create a cross-account role and call the AssumeRole API operation.
B. Create a user in the Example Corp account and provide the access keys.
C. Create a user in the Example Corp account and provide the credentials.
D. Create a cross-account role and call the AssumeRoleWithWebIdentity API operation.
Suggested answer: B
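For reference, the web-identity federation flow named in option D exchanges the OIDC token issued by the IdP for temporary AWS credentials via STS. The role ARN and token below are hypothetical placeholders.

```python
import boto3

sts = boto3.client("sts")

# Hypothetical cross-account role and OIDC token from AnyCompany's IdP
resp = sts.assume_role_with_web_identity(
    RoleArn="arn:aws:iam::999988887777:role/anycompany-developer",
    RoleSessionName="dev-session",
    WebIdentityToken="eyJraWQiOi...",  # truncated OIDC JWT placeholder
)
creds = resp["Credentials"]  # temporary AccessKeyId/SecretAccessKey/SessionToken
```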

Question 400


A company is developing a new web application in Python. A developer must deploy the application using AWS Elastic Beanstalk from the AWS Management Console. The developer creates an Elastic Beanstalk source bundle to upload using the console. Which of the following are requirements when creating the source bundle? (Select TWO.)

A. The source bundle must include the ebextensions.yaml file.
B. The source bundle must not include a top-level directory.
C. The source bundle must be compressed with any required dependencies in a top-level parent folder.
D. The source bundle must be created as a single .zip or .war file.
E. The source bundle must be uploaded into Amazon EFS.
Suggested answer: B, D
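A minimal Python sketch of building a bundle that satisfies B and D: zip the contents of the project directory so files sit at the archive root, with no top-level folder. The directory and bundle names are hypothetical.

```python
import os
import zipfile

def make_source_bundle(app_dir: str, bundle_path: str = "app.zip") -> None:
    """Zip the *contents* of app_dir so the archive has no top-level folder."""
    with zipfile.ZipFile(bundle_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(app_dir):
            for name in files:
                full = os.path.join(root, name)
                # Arcnames relative to app_dir keep files at the archive root
                zf.write(full, arcname=os.path.relpath(full, app_dir))

make_source_bundle("my-python-app")  # hypothetical project directory
```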