Amazon DVA-C01 Practice Test - Questions Answers, Page 6

What item operation allows the retrieval of multiple items from a DynamoDB table in a single API call?

A. GetItem
B. BatchGetItem
C. GetMultipleItems
D. GetItemRange
Suggested answer: B

Explanation:

https://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_BatchGetItem.html
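
For reference, a minimal boto3 sketch of a BatchGetItem call; the table name, key attribute, and key values are placeholders:

```python
# Hypothetical sketch: retrieving several items from a DynamoDB table in a
# single BatchGetItem call. Table name, key attribute, and values are placeholders.
import boto3

dynamodb = boto3.client("dynamodb")

response = dynamodb.batch_get_item(
    RequestItems={
        "ProductReviews": {  # placeholder table name
            "Keys": [
                {"reviewID": {"S": "r-001"}},
                {"reviewID": {"S": "r-002"}},
                {"reviewID": {"S": "r-003"}},
            ],
            "ProjectionExpression": "reviewID, starRating",
        }
    }
)

for item in response["Responses"]["ProductReviews"]:
    print(item)

# BatchGetItem can return UnprocessedKeys when limits are reached; retry those.
unprocessed = response.get("UnprocessedKeys", {})
```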

After launching an instance that you intend to serve as a NAT (Network Address Translation) device in a public subnet, you modify your route tables to make the NAT device the target of internet-bound traffic from your private subnet. When you try to make an outbound connection to the internet from an instance in the private subnet, you are not successful. Which of the following steps could resolve the issue?

A. Attaching a second Elastic Network Interface (ENI) to the NAT instance, and placing it in the private subnet
B. Attaching a second Elastic Network Interface (ENI) to the instance in the private subnet, and placing it in the public subnet
C. Disabling the Source/Destination Check attribute on the NAT instance
D. Attaching an Elastic IP address to the instance in the private subnet
Suggested answer: C

Explanation:

https://docs.aws.amazon.com/vpc/latest/userguide/VPC_NAT_Instance.html#NATInstance
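
As an illustration, disabling the check from code might look like the following boto3 sketch (the instance ID is a placeholder):

```python
# Hypothetical sketch: disabling the Source/Destination Check attribute on a
# NAT instance so it can forward traffic that is not addressed to itself.
import boto3

ec2 = boto3.client("ec2")

ec2.modify_instance_attribute(
    InstanceId="i-0123456789abcdef0",   # placeholder NAT instance ID
    SourceDestCheck={"Value": False},
)
```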

You attempt to store an object in the US-STANDARD region in Amazon S3, and receive a confirmation that it has been successfully stored. You then immediately make another API call and attempt to read this object. S3 tells you that the object does not exist. What could explain this behavior?

A. US-STANDARD uses eventual consistency and it can take time for an object to be readable in a bucket
B. Objects in Amazon S3 do not become visible until they are replicated to a second region.
C. US-STANDARD imposes a 1 second delay before new objects are readable.
D. You exceeded the bucket object limit, and once this limit is raised the object will be visible.
Suggested answer: A

Explanation:

https://acloud.guru/forums/aws-certified-developer-associate/discussion/-KGngHzVQ03OpeAA9jSP/i-cant-answer-a-sample-question-pretty-worried-about-the-real-thing
https://acloud.guru/forums/aws-certified-developer-associate/discussion/-K5WKXRAlJdOu58GREF_/s3-question
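
A common way to tolerate this behavior is to retry the read with backoff; a rough boto3 sketch, with placeholder bucket and key names:

```python
# Hypothetical sketch: retrying a GET shortly after a PUT to tolerate
# eventual consistency. Bucket and key names are placeholders.
import time

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
bucket, key = "mycoolapp-example", "data/report.json"

s3.put_object(Bucket=bucket, Key=key, Body=b'{"ok": true}')

for attempt in range(5):
    try:
        obj = s3.get_object(Bucket=bucket, Key=key)
        print(obj["Body"].read())
        break
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchKey":
            time.sleep(2 ** attempt)  # back off and try again
        else:
            raise
```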

What is the maximum number of S3 Buckets available per AWS account?

A. 100 per region
B. there is no limit
C. 100 per account
D. 500 per account
E. 100 per IAM user
Suggested answer: C

Explanation:

https://docs.aws.amazon.com/AmazonS3/latest/dev/BucketRestrictions.html

Which of the following items are required to allow an application deployed on an EC2 instance to write data to a DynamoDB table? Assume that no security keys are allowed to be stored on the EC2 instance. Choose 2 answers.

A. Create an IAM User that allows write access to the DynamoDB table.
B. Add an IAM Role to a running EC2 instance.
C. Add an IAM User to a running EC2 Instance.
D. Launch an EC2 Instance with the IAM Role included in the launch configuration.
E. Create an IAM Role that allows write access to the DynamoDB table.
F. Launch an EC2 Instance with the IAM User included in the launch configuration.
Suggested answer: B, E

Explanation:

https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/iam-roles-for-amazon-ec2.html#attachiam-role
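
Once a role that allows dynamodb:PutItem is attached to the instance, application code needs no embedded keys; boto3 resolves temporary credentials from the instance profile. A minimal sketch with a placeholder table and item:

```python
# Hypothetical sketch: writing to DynamoDB from an EC2 instance that has an
# IAM role attached. No access keys are configured in code; boto3 obtains
# temporary credentials from the instance metadata service automatically.
import boto3

table = boto3.resource("dynamodb").Table("ProductReviews")  # placeholder table

table.put_item(
    Item={
        "reviewID": "r-004",
        "productID": "p-123",
        "starRating": 5,
        "comment": "Works as expected.",
    }
)
```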

A Developer created a dashboard for an application using Amazon API Gateway, Amazon S3, AWS Lambda, and Amazon RDS. The Developer needs an authentication mechanism allowing a user to sign in and view the dashboard. It must be accessible from mobile applications, desktops, and tablets, and must remember user preferences across platforms. Which AWS service should the Developer use to support this authentication scenario?

A. AWS KMS
B. Amazon Cognito
C. AWS Directory Service
D. Amazon IAM
Suggested answer: B

Explanation:

Amazon Cognito user pools provide sign-up and sign-in functionality, while Cognito identity pools provide temporary credentials for accessing AWS services.
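
As a rough sketch, a user pool sign-in with boto3 might look like this (the app client ID, username, and password are placeholders, and the USER_PASSWORD_AUTH flow must be enabled on the app client):

```python
# Hypothetical sketch: signing a user in against a Cognito user pool with the
# USER_PASSWORD_AUTH flow. ClientId, username, and password are placeholders.
import boto3

cognito_idp = boto3.client("cognito-idp")

response = cognito_idp.initiate_auth(
    ClientId="example-app-client-id",
    AuthFlow="USER_PASSWORD_AUTH",
    AuthParameters={"USERNAME": "jane", "PASSWORD": "correct-horse-battery"},
)

tokens = response["AuthenticationResult"]  # contains IdToken, AccessToken, RefreshToken
print(tokens["IdToken"][:20], "...")
```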

A Developer has created an S3 bucket s3://mycoolapp and has enabled server access logging that points to the folder s3://mycoolapp/logs. The Developer moved 100 KB of Cascading Style Sheets (CSS) documents to the folder s3://mycoolapp/css, and then stopped work. When the Developer came back a few days later, the bucket was 50 GB. What is the MOST likely cause of this situation?

A. The CSS files were not compressed and S3 versioning was enabled.
B. S3 replication was enabled on the bucket.
C. Logging into the same bucket caused exponential log growth.
D. An S3 lifecycle policy has moved the entire CSS file to S3 Infrequent Access.
Suggested answer: C

Explanation:

Refer AWS documentation - S3 Server logs

To turn on log delivery, you provide the following logging configuration information:

The name of the target bucket where you want Amazon S3 to save the access logs as objects. You can have logs delivered to any bucket that you own that is in the same Region as the source bucket, including the source bucket itself.

We recommend that you save access logs in a different bucket so that you can easily manage the logs. If you choose to save access logs in the source bucket, we recommend that you specify a prefix for all log object keys so that the object names begin with a common string and the log objects are easier to identify.

When your source bucket and target bucket are the same bucket, additional logs are created for the logs that are written to the bucket. This behavior might not be ideal for your use case because it could result in a small increase in your storage billing. In addition, the extra logs about logs might make it harder to find the log that you're looking for.
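
A minimal boto3 sketch of configuring access logging to a separate target bucket, which avoids the logs-about-logs loop described above (bucket names and prefix are placeholders):

```python
# Hypothetical sketch: enabling server access logging that delivers logs to a
# separate target bucket under a key prefix, rather than to the source bucket.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_logging(
    Bucket="mycoolapp",
    BucketLoggingStatus={
        "LoggingEnabled": {
            "TargetBucket": "mycoolapp-access-logs",  # a different bucket
            "TargetPrefix": "mycoolapp/",
        }
    },
)
```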

A Developer is creating an Auto Scaling group whose instances need to publish a custom metric to Amazon CloudWatch. Which method would be the MOST secure way to authenticate a CloudWatch PUT request?

A. Create an IAM user with PutMetricData permission and put the user credentials in a private repository; have applications pull the credentials as needed.
B. Create an IAM user with PutMetricData permission, and modify the Auto Scaling launch configuration to inject the user credentials into the instance user data.
C. Modify the CloudWatch metric policies to allow the PutMetricData permission to instances from the Auto Scaling group.
D. Create an IAM role with PutMetricData permission and modify the Auto Scaling launch configuration to launch instances using that role.
Suggested answer: D
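
With the role attached, application code can call PutMetricData without any stored credentials; a minimal boto3 sketch with a placeholder namespace and metric:

```python
# Hypothetical sketch: publishing a custom metric from an EC2 instance whose
# IAM role grants cloudwatch:PutMetricData. Credentials come from the role.
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_data(
    Namespace="MyApp",                       # placeholder namespace
    MetricData=[
        {
            "MetricName": "ActiveSessions",  # placeholder metric name
            "Value": 42.0,
            "Unit": "Count",
        }
    ],
)
```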

A Developer is working on an application that tracks hundreds of millions of product reviews in an Amazon DynamoDB table. The records include the data elements reviewID, productID, starRating, and comment.

Which field, when used as the partition key, would result in the MOST consistent performance using DynamoDB?

A. starRating
B. reviewID
C. comment
D. productID
Suggested answer: B
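
reviewID is effectively unique per record, so it spreads requests evenly across partitions, whereas starRating and productID concentrate traffic on a small number of hot partition key values. A sketch of creating the table with reviewID as the partition key (the table name is a placeholder):

```python
# Hypothetical sketch: creating the reviews table with the high-cardinality
# reviewID attribute as the partition key for uniform load distribution.
import boto3

dynamodb = boto3.client("dynamodb")

dynamodb.create_table(
    TableName="ProductReviews",  # placeholder table name
    AttributeDefinitions=[{"AttributeName": "reviewID", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "reviewID", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
)
```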

A Developer has written a serverless application using multiple AWS services. The business logic is written as a Lambda function which has dependencies on third-party libraries. The Lambda function endpoints will be exposed using Amazon API Gateway. The Lambda function will write the information to Amazon DynamoDB.

The Developer is ready to deploy the application but must have the ability to roll back. How can this deployment be automated, based on these requirements?

A. Deploy using Amazon Lambda API operations to create the Lambda function by providing a deployment package.
B. Use an AWS CloudFormation template and use CloudFormation syntax to define the Lambda function resource in the template.
C. Use syntax conforming to the Serverless Application Model in the AWS CloudFormation template to define the Lambda function resource.
D. Create a bash script which uses AWS CLI to package and deploy the application.
Suggested answer: C

Explanation:

Refer AWS documentation - SAM Gradual Code Deployment

If you use AWS SAM to create your serverless application, it comes built-in with AWS CodeDeploy to help ensure safe Lambda deployments. With just a few lines of configuration, AWS SAM does the following for you:

Deploys new versions of your Lambda function, and automatically creates aliases that point to the new version.
Gradually shifts customer traffic to the new version until you're satisfied that it's working as expected, or you roll back the update.
Defines pre-traffic and post-traffic test functions to verify that the newly deployed code is configured correctly and your application operates as expected.
Rolls back the deployment if CloudWatch alarms are triggered.
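
The usual workflow is the SAM CLI (sam build and sam deploy); purely as an illustration, the same SAM-transformed template can also be driven through CloudFormation from Python. The stack name, function definition, handler, runtime, and CodeUri below are placeholders:

```python
# Hypothetical sketch: deploying a SAM-defined Lambda function through
# CloudFormation from Python. Template contents and stack name are placeholders;
# the SAM CLI (sam build / sam deploy) is the usual tool for this.
import boto3

TEMPLATE = """
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  ReviewsFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.handler
      Runtime: python3.12
      CodeUri: s3://example-artifacts/reviews-function.zip
      AutoPublishAlias: live
      DeploymentPreference:
        Type: Canary10Percent5Minutes   # gradual traffic shift with automatic rollback
"""

cloudformation = boto3.client("cloudformation")

cloudformation.create_stack(
    StackName="reviews-app",
    TemplateBody=TEMPLATE,
    # CAPABILITY_IAM for the generated execution role, CAPABILITY_AUTO_EXPAND
    # because the template uses the SAM transform.
    Capabilities=["CAPABILITY_IAM", "CAPABILITY_AUTO_EXPAND"],
)
```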
