ExamGecko

DVA-C01: AWS Certified Developer - Associate

Vendor: Amazon

AWS Certified Developer - Associate Exam Questions: 608

Exam Number: DVA-C01

Exam Name: AWS Certified Developer - Associate

Length of test: 130 mins

Exam Format: Multiple-choice and multiple-response questions

Language Offered: English, French, German, Italian, Japanese, Korean, Portuguese, Simplified Chinese, and Spanish

Number of questions in the actual exam: 65 questions

Passing Score: 720 on a scaled score of 100-1,000 (approximately 48 of 65 questions)

This certification is designed for individuals who develop and maintain applications on AWS. It validates proficiency in developing, testing, deploying, and debugging cloud-based applications on AWS.

Related questions

A company wants to implement continuous integration for its workloads on AWS. The company wants to trigger unit tests in its pipeline for commits on its code repository, and wants to be notified of failure events in the pipeline. How can these requirements be met?

A. Store the source code in AWS CodeCommit. Create a CodePipeline to automate unit testing. Use Amazon SNS to trigger notifications of failure events.
B. Store the source code in GitHub. Create a CodePipeline to automate unit testing. Use Amazon SES to trigger notifications of failure events.
C. Store the source code in GitHub. Create a CodePipeline to automate unit testing. Use Amazon CloudWatch to trigger notifications of failure events.
D. Store the source code in AWS CodeCommit. Create a CodePipeline to automate unit testing. Use Amazon CloudWatch to trigger notifications of failure events.
Suggested answer: D
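
No explanation accompanies the suggested answer; one common way to realize option D is a CloudWatch Events (EventBridge) rule that matches failed CodePipeline executions and publishes to an SNS topic for notification. A minimal boto3 sketch, assuming the placeholder names my-pipeline and pipeline-failures:

# Hedged sketch: wire a CloudWatch Events (EventBridge) rule so that a failed
# CodePipeline execution publishes to an SNS topic. Names such as
# "my-pipeline" and "pipeline-failures" are placeholders.
import json
import boto3

events = boto3.client("events")
sns = boto3.client("sns")

topic_arn = sns.create_topic(Name="pipeline-failures")["TopicArn"]

# Match only FAILED pipeline execution state changes for one pipeline.
pattern = {
    "source": ["aws.codepipeline"],
    "detail-type": ["CodePipeline Pipeline Execution State Change"],
    "detail": {"pipeline": ["my-pipeline"], "state": ["FAILED"]},
}

events.put_rule(
    Name="my-pipeline-failed",
    EventPattern=json.dumps(pattern),
    State="ENABLED",
)

# Point the rule at the SNS topic (the topic policy must allow events.amazonaws.com).
events.put_targets(
    Rule="my-pipeline-failed",
    Targets=[{"Id": "notify-sns", "Arn": topic_arn}],
)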

An application stores payroll information nightly in DynamoDB for a large number of employees across hundreds of offices. Item attributes consist of individual name, office identifier, and cumulative daily hours. Managers run reports for ranges of names working in their office. One query is: "Return all items in this office for names starting with A through E." Which table configuration will result in the lowest impact on provisioned throughput for this query?

A. Configure the table to have a hash index on the name attribute, and a range index on the office identifier
B. Configure the table to have a range index on the name attribute, and a hash index on the office identifier
C. Configure a hash index on the name attribute and no range index
D. Configure a hash index on the office identifier attribute and no range index
Suggested answer: B

Explanation:

https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/HowItWorks.CoreComponents.html

Partition key and sort key – Referred to as a composite primary key, this type of key is composed of two attributes. The first attribute is the partition key, and the second attribute is the sort key.

DynamoDB uses the partition key value as input to an internal hash function. The output from the hash function determines the partition (physical storage internal to DynamoDB) in which the item will be stored. All items with the same partition key value are stored together, in sorted order by sort key value.
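
As a rough illustration of why option B has the lowest throughput impact: with the office identifier as the partition (hash) key and the name as the sort (range) key, the whole report becomes a single Query with a key condition instead of a Scan plus filter. A sketch assuming boto3 and made-up table and attribute names:

# Illustrative sketch only: a table keyed on office_id (partition key) and
# employee_name (sort key) serves "names A through E in this office" with one
# Query. Table and attribute names here are assumptions, not from the question.
import boto3
from boto3.dynamodb.conditions import Key

table = boto3.resource("dynamodb").Table("PayrollHours")

response = table.query(
    KeyConditionExpression=Key("office_id").eq("office-042")
    & Key("employee_name").between("A", "F"),  # covers names starting with A through E
)
for item in response["Items"]:
    print(item["employee_name"], item["daily_hours"])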


A company has an application that uses Amazon Cognito user pools as an identity provider. The company must secure access to user records. The company has set up multi-factor authentication (MFA). The company also wants to send a login activity notification by email every time a user logs in.

What is the MOST operationally efficient solution that meets this requirement?

Become a Premium Member for full access
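
The answer choices are premium-locked here, but a common pattern for the email-on-login requirement (not necessarily the intended answer) is a Cognito post-authentication Lambda trigger that sends the notification through Amazon SES. A hedged sketch with placeholder addresses:

# Hedged sketch: a Cognito post-authentication Lambda trigger that emails a
# login notification via Amazon SES. The sender address is a placeholder and
# must be verified in SES.
import boto3

ses = boto3.client("ses")

def lambda_handler(event, context):
    # Cognito invokes this trigger after every successful sign-in.
    email = event["request"]["userAttributes"].get("email")
    if email:
        ses.send_email(
            Source="no-reply@example.com",
            Destination={"ToAddresses": [email]},
            Message={
                "Subject": {"Data": "New sign-in to your account"},
                "Body": {"Text": {"Data": "We noticed a new login to your account."}},
            },
        )
    # Cognito triggers must return the event object.
    return event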

A development team consists of 10 team members. The manager wants to grant each team member access to a user-specific folder in an Amazon S3 bucket, similar to a home directory. For the team member with the username "TeamMemberX", the snippet of the IAM policy looks like this:

Instead of creating distinct policies for each team member, what approach can be used to make this policy snippet generic for all team members?

A. Use IAM policy condition
B. Use IAM policy principal
C. Use IAM policy variables
D. Use IAM policy resource
Suggested answer: C

Explanation:

https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_variables.html

Use AWS Identity and Access Management (IAM) policy variables as placeholders when you don't know the exact value of a resource or condition key when you write the policy.
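
As an illustration, a single managed policy can use the ${aws:username} policy variable so that every team member is confined to their own prefix. A hedged boto3 sketch; the bucket name, prefix layout, and policy name are placeholders rather than the snippet from the question:

# One managed policy for all team members using the ${aws:username} variable.
import json
import boto3

policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::team-bucket/home/${aws:username}/*",
        },
        {
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::team-bucket",
            "Condition": {"StringLike": {"s3:prefix": ["home/${aws:username}/*"]}},
        },
    ],
}

boto3.client("iam").create_policy(
    PolicyName="TeamMemberHomeFolder",
    PolicyDocument=json.dumps(policy_document),
)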


During non-peak hours, a Developer wants to minimize the execution time of a full Amazon DynamoDB table scan without affecting normal workloads. The workloads average half of the strongly consistent read capacity units during non-peak hours.

How would the Developer optimize this scan?

A. Use parallel scans while limiting the rate
B. Use sequential scans
C. Increase read capacity units during the scan operation
D. Change consistency to eventually consistent during the scan operation
Suggested answer: A

Explanation:

https://aws.amazon.com/blogs/developer/rate-limited-scans-in-amazon-dynamodb/
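
A rough sketch of the idea (not the blog post's exact code): run one worker per Segment so the scan parallelizes, request ReturnConsumedCapacity, and sleep in proportion to the capacity consumed so the scan stays within the spare read capacity. The table name and the 50-RCU budget below are assumptions:

# Rate-limited parallel scan: each worker scans one segment, asks DynamoDB to
# report consumed capacity, and sleeps to stay under a self-imposed budget so
# normal traffic is unaffected.
import time
import boto3

table = boto3.resource("dynamodb").Table("Payroll")

def scan_segment(segment, total_segments, max_rcu_per_second=50):
    start_key = None
    while True:
        kwargs = {
            "Segment": segment,
            "TotalSegments": total_segments,
            "Limit": 100,                      # keep each page small
            "ReturnConsumedCapacity": "TOTAL",
        }
        if start_key:
            kwargs["ExclusiveStartKey"] = start_key
        page = table.scan(**kwargs)

        consumed = page["ConsumedCapacity"]["CapacityUnits"]
        time.sleep(consumed / max_rcu_per_second)  # simple rate limiting

        yield from page["Items"]
        start_key = page.get("LastEvaluatedKey")
        if not start_key:
            break

# Each segment can then be handed to its own thread or process, e.g.:
# with ThreadPoolExecutor() as pool:
#     pool.map(lambda s: list(scan_segment(s, 4)), range(4))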


Which of the following are correct statements about policy evaluation logic in AWS Identity and Access Management? Choose 2 answers.

A. By default, all requests are denied
B. An explicit allow overrides an explicit deny
C. An explicit allow overrides default deny
D. An explicit deny does not override an explicit allow
E. By default, all requests are allowed
Suggested answer: A, C

Explanation:

https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_evaluation-logic.html

By default, all requests are implicitly denied. (Alternatively, by default, the AWS account root user has full access.) An explicit allow in an identity-based or resource-based policy overrides this default.

If a permissions boundary, Organizations SCP, or session policy is present, it might override the allow with an implicit deny. An explicit deny in any policy overrides any allows.
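
As an illustration of the default-deny and explicit-allow rules, the IAM policy simulator can be called from boto3. In the hedged sketch below (bucket and object names are placeholders), the action covered by an Allow statement evaluates to "allowed", while the uncovered action falls back to "implicitDeny":

# Illustrative sketch of policy evaluation using the IAM policy simulator.
import json
import boto3

iam = boto3.client("iam")

allow_get = json.dumps({
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::example-bucket/*",
    }],
})

result = iam.simulate_custom_policy(
    PolicyInputList=[allow_get],
    ActionNames=["s3:GetObject", "s3:PutObject"],
    ResourceArns=["arn:aws:s3:::example-bucket/object.txt"],
)

for r in result["EvaluationResults"]:
    # s3:GetObject -> "allowed"; s3:PutObject -> "implicitDeny" (default deny)
    print(r["EvalActionName"], r["EvalDecision"])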


You are writing to a DynamoDB table and receive the following exception: "ProvisionedThroughputExceededException". However, according to your CloudWatch metrics for the table, you are not exceeding your provisioned throughput. What could be an explanation for this?

A. You haven't provisioned enough DynamoDB storage instances
B. You're exceeding your capacity on a particular Range Key
C. You're exceeding your capacity on a particular Hash Key
D. You're exceeding your capacity on a particular Sort Key
E. You haven't configured DynamoDB Auto Scaling triggers
Suggested answer: C

Explanation:


https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/HowItWorks.CoreComponents.html#HowItWorks.CoreComponents.PrimaryKey

https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/HowItWorks.Partitions.html

https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/bp-partition-key-design.html
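
Provisioned throughput is divided across partitions, so a single hot hash (partition) key can be throttled even though table-level CloudWatch metrics look healthy. One mitigation described in the partition-key design guide is write sharding; the sketch below assumes boto3 and placeholder table and attribute names:

# Hedged sketch of write sharding: spread writes for a hot logical key across
# several physical partition key values by appending a random suffix.
import random
import boto3

table = boto3.resource("dynamodb").Table("Events")
SHARDS = 10

def put_item_sharded(logical_key, attributes):
    # Traffic for one hot key no longer concentrates on a single partition,
    # because it is spread over SHARDS partition key values.
    table.put_item(Item={
        "pk": f"{logical_key}#{random.randint(0, SHARDS - 1)}",
        **attributes,
    })

# Reads must then fan out over all shard suffixes (pk = "key#0" .. "key#9")
# and merge the results.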


A Developer is investigating an issue whereby certain requests are passing through an Amazon API Gateway endpoint /MyAPI, but the requests do not reach the AWS Lambda function backing /MyAPI. The Developer found that a second Lambda function sometimes runs at maximum concurrency allowed for the given AWS account. How can the Developer address this issue?

A. Manually reduce the concurrent execution limit at the account level
B. Add another API Gateway stage for /MyAPI, and shard the requests
C. Configure the second Lambda function’s concurrency execution limit
D. Reduce the throttling limits in the API Gateway /MyAPI endpoint
Suggested answer: C

Explanation:

https://aws.amazon.com/about-aws/whats-new/2017/11/set-concurrency-limits-on-individual-aws-lambda-functions/

You can now set a concurrency limit on individual AWS Lambda functions. The concurrency limit you set will reserve a portion of your account level concurrency limit for a given function. This feature allows you to throttle a given function if it reaches a maximum number of concurrent executions allowed, which you can choose to set.
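
For option C, a minimal boto3 sketch with a placeholder function name and limit:

# Reserve concurrency for the noisy second function so it can no longer
# consume the whole account-level pool. "second-function" and 100 are
# placeholders.
import boto3

boto3.client("lambda").put_function_concurrency(
    FunctionName="second-function",
    ReservedConcurrentExecutions=100,
)

Reserving capacity for the noisy function both caps it and guarantees that the remainder of the account-level limit stays available to other functions, including the one backing /MyAPI.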


A developer runs an application that uses an Amazon API Gateway REST API. The developer needs to implement a solution to proactively monitor the health of both API responses and latencies in case a deployment causes a service disruption despite passing deployment pipeline tests. The solution also must check for endpoint vulnerability and unauthorized changes to APIs, URLs, and website content. Which solution will meet these requirements?

Become a Premium Member for full access

A developer has written the following IAM policy to provide access to an Amazon S3 bucket:

Which access does the policy allow regarding the s3:GetObject and s3:PutObject actions?

Become a Premium Member for full access