Amazon DVA-C02 Practice Test - Questions Answers, Page 9

A company is building a microservices application that consists of many AWS Lambda functions. The development team wants to use AWS Serverless Application Model (AWS SAM) templates to automatically test the Lambda functions. The development team plans to test a small percentage of traffic that is directed to new updates before the team commits to a full deployment of the application.

Which combination of steps will meet these requirements in the MOST operationally efficient way?

(Select TWO.)

A. Use AWS SAM CLI commands in AWS CodeDeploy to invoke the Lambda functions to test the deployment.
B. Declare the EventInvokeConfig on the Lambda functions in the AWS SAM templates with OnSuccess and OnFailure configurations.
C. Enable gradual deployments through AWS SAM templates.
D. Set the deployment preference type to Canary10Percent30Minutes. Use hooks to test the deployment.
E. Set the deployment preference type to Linear10PercentEvery10Minutes. Use hooks to test the deployment.
Suggested answer: C, D

Explanation:

This solution will meet the requirements by using AWS Serverless Application Model (AWS SAM) templates and gradual deployments to automatically test the Lambda functions. AWS SAM templates are configuration files that define serverless applications and resources such as Lambda functions. Gradual deployments are a feature of AWS SAM that enable deploying new versions of Lambda functions incrementally, shifting traffic gradually, and performing validation tests during deployment. The developer can enable gradual deployments through AWS SAM templates by adding a DeploymentPreference property to each Lambda function resource in the template. The developer can set the deployment preference type to Canary10Percent30Minutes, which means that 10 percent of traffic will be shifted to the new version of the Lambda function for 30 minutes before shifting 100 percent of traffic. The developer can also use hooks to test the deployment, which are custom Lambda functions that run before or after traffic shifting and perform validation tests or rollback actions.
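
As an illustration of the hooks mentioned above: a pre-traffic or post-traffic hook is an ordinary Lambda function that reports a test result back to AWS CodeDeploy. A minimal Python (boto3) sketch, assuming a hypothetical run_smoke_tests helper for the actual validation logic:

    import boto3

    codedeploy = boto3.client("codedeploy")

    def handler(event, context):
        # CodeDeploy passes the identifiers needed to report the hook's result.
        deployment_id = event["DeploymentId"]
        execution_id = event["LifecycleEventHookExecutionId"]

        # "Succeeded" lets traffic shifting continue; "Failed" triggers rollback.
        status = "Succeeded" if run_smoke_tests() else "Failed"

        codedeploy.put_lifecycle_event_hook_execution_status(
            deploymentId=deployment_id,
            lifecycleEventHookExecutionId=execution_id,
            status=status,
        )

    def run_smoke_tests():
        # Placeholder for tests against the new Lambda version (hypothetical).
        return True

In the SAM template, the function would be referenced from the Hooks section (PreTraffic or PostTraffic) of the DeploymentPreference property, so CodeDeploy invokes it automatically during each canary deployment.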

Reference: AWS Serverless Application Model (AWS SAM), Gradual Code Deployment

A company is using AWS CloudFormation to deploy a two-tier application. The application will use Amazon RDS as its backend database. The company wants a solution that will randomly generate the database password during deployment. The solution also must automatically rotate the database password without requiring changes to the application.

What is the MOST operationally efficient solution that meets these requirements?

A. Use an AWS Lambda function as a CloudFormation custom resource to generate and rotate the password.
B. Use an AWS Systems Manager Parameter Store resource with the SecureString data type to generate and rotate the password.
C. Use a cron daemon on the application's host to generate and rotate the password.
D. Use an AWS Secrets Manager resource to generate and rotate the password.
Suggested answer: D

Explanation:

This solution will meet the requirements by using AWS Secrets Manager, which is a service that helps protect secrets such as database credentials by encrypting them with AWS Key Management Service (AWS KMS) and enabling automatic rotation of secrets. The developer can use an AWS Secrets Manager resource in the AWS CloudFormation template, which enables creating and managing secrets as part of a CloudFormation stack. The developer can use an AWS::SecretsManager::Secret resource type to generate and rotate the password for accessing the RDS database during deployment.

The developer can also define a rotation schedule for the secret (the AWS::SecretsManager::RotationSchedule resource), which specifies how often to rotate the secret and which Lambda function to use for the rotation logic. Option A is not optimal because it will use an AWS Lambda function as a CloudFormation custom resource, which may introduce additional complexity and overhead for creating and managing a custom resource and implementing rotation logic. Option B is not optimal because it will use an AWS Systems Manager Parameter Store resource with the SecureString data type, which does not support automatic rotation of secrets. Option C is not optimal because it will use a cron daemon on the application's host to generate and rotate the password, which may incur more costs and require more maintenance for running and securing a host.
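
To illustrate why the application needs no changes when the password rotates: the application can fetch the current secret value at connection time. A minimal Python (boto3) sketch, where the secret name is a hypothetical placeholder:

    import json
    import boto3

    def get_db_credentials(secret_id="prod/rds/credentials"):  # hypothetical name
        # Each call returns the current version of the secret, so after a
        # rotation the new password is picked up transparently.
        client = boto3.client("secretsmanager")
        response = client.get_secret_value(SecretId=secret_id)
        return json.loads(response["SecretString"])

    creds = get_db_credentials()
    # creds["username"] and creds["password"] feed the RDS connection.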

Reference: AWS Secrets Manager, AWS::SecretsManager::Secret

A company's website runs on an Amazon EC2 instance and uses Auto Scaling to scale the environment during peak times. Website users across the world are experiencing high latency due to static content on the EC2 instance, even during non-peak hours.

Which combination of steps will resolve the latency issue? (Select TWO.)

A. Double the Auto Scaling group's maximum number of servers.
B. Host the application code on AWS Lambda.
C. Scale vertically by resizing the EC2 instances.
D. Create an Amazon CloudFront distribution to cache the static content.
E. Store the application's static content in Amazon S3.
Suggested answer: D, E

Explanation:

The combination of steps that will resolve the latency issue is to create an Amazon CloudFront distribution to cache the static content and store the application's static content in Amazon S3. This way, the company can use CloudFront to deliver the static content from edge locations that are closer to the website users, reducing latency and improving performance. The company can also use S3 to store the static content reliably and cost-effectively, and integrate it with CloudFront easily. The other options either do not address the latency issue, or are not necessary or feasible for the given scenario.
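
A minimal Python (boto3) sketch of this setup, assuming a hypothetical bucket name and using the ID documented for the AWS managed CachingOptimized cache policy:

    import time
    import boto3

    s3 = boto3.client("s3")
    cloudfront = boto3.client("cloudfront")

    BUCKET = "example-static-content"  # hypothetical bucket name

    # Store a static asset in S3.
    s3.upload_file("css/site.css", BUCKET, "css/site.css")

    # Create a distribution that serves the bucket from edge locations.
    cloudfront.create_distribution(
        DistributionConfig={
            "CallerReference": str(time.time()),
            "Comment": "Static content for the website",
            "Enabled": True,
            "Origins": {
                "Quantity": 1,
                "Items": [{
                    "Id": "s3-static",
                    "DomainName": f"{BUCKET}.s3.amazonaws.com",
                    "S3OriginConfig": {"OriginAccessIdentity": ""},
                }],
            },
            "DefaultCacheBehavior": {
                "TargetOriginId": "s3-static",
                "ViewerProtocolPolicy": "redirect-to-https",
                # AWS managed "CachingOptimized" cache policy ID.
                "CachePolicyId": "658327ea-f89d-4fab-a63d-7e88639e58f6",
            },
        }
    )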

Reference: Using Amazon S3 Origins and Custom Origins for Web Distributions

An online food company provides an Amazon API Gateway HTTP API to receive orders from partners.

The API is integrated with an AWS Lambda function. The Lambda function stores the orders in an Amazon DynamoDB table.

The company expects to onboard additional partners. Some of the partners require an additional Lambda function to receive orders. The company has created an Amazon S3 bucket. The company needs to store all orders and updates in the S3 bucket for future analysis. How can the developer ensure that all orders and updates are stored to Amazon S3 with the LEAST development effort?

A. Create a new Lambda function and a new API Gateway API endpoint. Configure the new Lambda function to write to the S3 bucket. Modify the original Lambda function to post updates to the new API endpoint.
B. Use Amazon Kinesis Data Streams to create a new data stream. Modify the Lambda function to publish orders to the data stream. Configure the data stream to write to the S3 bucket.
C. Enable DynamoDB Streams on the DynamoDB table. Create a new Lambda function. Associate the stream's Amazon Resource Name (ARN) with the Lambda function. Configure the Lambda function to write to the S3 bucket as records appear in the table's stream.
D. Modify the Lambda function to publish to a new Amazon Simple Notification Service (Amazon SNS) topic as it receives orders. Subscribe a new Lambda function to the topic. Configure the new Lambda function to write to the S3 bucket as updates come through the topic.
Suggested answer: C

Explanation:

This solution will ensure that all orders and updates are stored to Amazon S3 with the least development effort because it uses DynamoDB Streams to capture changes in the DynamoDB table and trigger a Lambda function to write those changes to the S3 bucket. This way, the original Lambda function and API Gateway API endpoint do not need to be modified, and no additional services are required. Option A is not optimal because it will require more development effort to create a new Lambda function and a new API Gateway API endpoint, and to modify the original Lambda function to post updates to the new API endpoint. Option B is not optimal because it will introduce additional costs and complexity to use Amazon Kinesis Data Streams to create a new data stream, and to modify the Lambda function to publish orders to the data stream. Option D is not optimal because it will require more development effort to modify the Lambda function to publish to a new Amazon SNS topic, and to create and subscribe a new Lambda function to the topic.
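
As a sketch, the new Lambda function subscribed to the stream only needs to copy each change record into S3. A minimal Python handler, with a hypothetical archive bucket name:

    import json
    import boto3

    s3 = boto3.client("s3")
    BUCKET = "orders-archive-bucket"  # hypothetical bucket name

    def handler(event, context):
        # Each record describes an INSERT, MODIFY, or REMOVE on the table.
        for record in event["Records"]:
            change = record["dynamodb"]  # keys plus new/old item images
            s3.put_object(
                Bucket=BUCKET,
                Key=f"orders/{record['eventID']}.json",
                Body=json.dumps(change, default=str),
            )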

Reference: Using DynamoDB Streams, Using AWS Lambda with Amazon S3

A company has an Amazon S3 bucket containing premier content that it intends to make available to only paid subscribers of its website. The S3 bucket currently has default permissions of all objects being private to prevent inadvertent exposure of the premier content to non-paying website visitors.

How can the company limit the ability to download a premier content file in the S3 bucket to paid subscribers only?

A. Apply a bucket policy that allows anonymous users to download the content from the S3 bucket.
B. Generate a pre-signed object URL for the premier content file when a paid subscriber requests a download.
C. Add a bucket policy that requires multi-factor authentication for requests to access the S3 bucket objects.
D. Enable server-side encryption on the S3 bucket for data protection against the non-paying website visitors.
Suggested answer: B

Explanation:

This solution will limit the ability to download a premier content file in the S3 bucket to paid subscribers only because it uses a pre-signed object URL that grants temporary access to an S3 object for a specified duration. The pre-signed object URL can be generated by the company's website when a paid subscriber requests a download, and can be verified by Amazon S3 using the signature in the URL. Option A is not optimal because it will allow anyone to download the content from the S3 bucket without verifying their subscription status. Option C is not optimal because it will require additional steps and costs to configure multi-factor authentication for accessing the S3 bucket objects, which may not be feasible or user-friendly for paid subscribers. Option D is not optimal because it will not prevent non-paying website visitors from accessing the S3 bucket objects, but only encrypt them at rest.
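
A minimal Python (boto3) sketch of generating such a URL, with hypothetical bucket and key names; the website backend would run this only after verifying the subscription:

    import boto3

    s3 = boto3.client("s3")

    def premier_download_url(bucket, key, expires_in=300):
        # The URL carries a signature that S3 validates; it stops working
        # after expires_in seconds, so it cannot be shared indefinitely.
        return s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": bucket, "Key": key},
            ExpiresIn=expires_in,
        )

    url = premier_download_url("premier-content-bucket", "videos/episode-1.mp4")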

Reference: Share an Object with Others, Using Amazon S3 Pre-Signed URLs

A developer is creating an AWS Lambda function that searches for items from an Amazon DynamoDB table that contains customer contact information. The DynamoDB table items have the customer's email_address as the partition key and additional properties such as customer_type, name, and job_title.

The Lambda function runs whenever a user types a new character into the customer_type text input. The developer wants the search to return partial matches of all the email_address properties of a particular customer type. The developer does not want to recreate the DynamoDB table.

What should the developer do to meet these requirements?

A. Add a global secondary index (GSI) to the DynamoDB table with customer_type as the partition key and email_address as the sort key. Perform a query operation on the GSI by using the begins_with key condition expression with the email_address property.
B. Add a global secondary index (GSI) to the DynamoDB table with email_address as the partition key and customer_type as the sort key. Perform a query operation on the GSI by using the begins_with key condition expression with the email_address property.
C. Add a local secondary index (LSI) to the DynamoDB table with customer_type as the partition key and email_address as the sort key. Perform a query operation on the LSI by using the begins_with key condition expression with the email_address property.
D. Add a local secondary index (LSI) to the DynamoDB table with job_title as the partition key and email_address as the sort key. Perform a query operation on the LSI by using the begins_with key condition expression with the email_address property.
Suggested answer: A

Explanation:

The solution that will meet the requirements is to add a global secondary index (GSI) to the DynamoDB table with customer_type as the partition key and email_address as the sort key. Perform a query operation on the GSI by using the begins_with key condition expression with the email_address property. This way, the developer can search for partial matches of the email_address property of a particular customer type without recreating the DynamoDB table. The other options either involve using a local secondary index (LSI), which requires recreating the table, or using a different partition key, which does not allow filtering by customer_type.
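
A minimal Python (boto3) sketch of the query, assuming hypothetical table and GSI names:

    import boto3
    from boto3.dynamodb.conditions import Key

    table = boto3.resource("dynamodb").Table("CustomerContacts")  # hypothetical

    def search_emails(customer_type, email_prefix):
        # Exact match on the GSI partition key, prefix match on the sort key.
        response = table.query(
            IndexName="customer_type-email_address-index",  # hypothetical GSI
            KeyConditionExpression=(
                Key("customer_type").eq(customer_type)
                & Key("email_address").begins_with(email_prefix)
            ),
        )
        return response["Items"]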

Reference: Using Global Secondary Indexes in DynamoDB

A developer is building an application that uses Amazon API Gateway APIs, AWS Lambda functions, and Amazon DynamoDB tables. The developer uses the AWS Serverless Application Model (AWS SAM) to build and run serverless applications on AWS. Each time the developer pushes changes to only the Lambda functions, all the artifacts in the application are rebuilt.

The developer wants to implement AWS SAM Accelerate by running a command to only redeploy the Lambda functions that have changed.

Which command will meet these requirements?

A. sam deploy --force-upload
B. sam deploy --no-execute-changeset
C. sam package
D. sam sync --watch
Suggested answer: D

Explanation:

The command that will meet the requirements is sam sync --watch. This command enables AWS SAM Accelerate mode, which allows the developer to only redeploy the Lambda functions that have changed. The --watch flag enables file watching, which automatically detects changes in the source code and triggers a redeployment. The other commands either do not enable AWS SAM Accelerate mode, or do not redeploy the Lambda functions automatically.

Reference: AWS SAM Accelerate

A developer is building an application that gives users the ability to view bank account information from multiple sources in a single dashboard. The developer has automated the process to retrieve API credentials for these sources. The process invokes an AWS Lambda function that is associated with an AWS CloudFormation custom resource.

The developer wants a solution that will store the API credentials with minimal operational overhead.

Which solution will meet these requirements?

A. Add an AWS Secrets Manager GenerateSecretString resource to the CloudFormation template. Set the value to reference the new credentials in the CloudFormation resource.
B. Use the AWS SDK ssm PutParameter operation in the Lambda function from the existing custom resource to store the credentials as a parameter. Set the parameter value to reference the new credentials. Set the parameter type to SecureString.
C. Add an AWS Systems Manager Parameter Store resource to the CloudFormation template. Set the CloudFormation resource value to reference the new credentials. Set the resource NoEcho attribute to true.
D. Use the AWS SDK ssm PutParameter operation in the Lambda function from the existing custom resource to store the credentials as a parameter. Set the parameter value to reference the new credentials. Set the parameter NoEcho attribute to true.
Suggested answer: B

Explanation:

The solution that will meet the requirements is to use the AWS SDK ssm PutParameter operation in the Lambda function from the existing custom resource to store the credentials as a parameter. Set the parameter value to reference the new credentials. Set the parameter type to SecureString. This way, the developer can store the API credentials with minimal operational overhead, as AWS Systems Manager Parameter Store provides secure and scalable storage for configuration data. The SecureString parameter type encrypts the parameter value with AWS Key Management Service (AWS KMS). The other options either involve adding additional resources to the CloudFormation template, which increases complexity and cost, or do not encrypt the parameter value, which reduces security.
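
A minimal Python sketch of the custom-resource Lambda function, assuming a hypothetical parameter name and resource property, and using the cfnresponse helper module to report the result back to CloudFormation:

    import boto3
    import cfnresponse  # helper available to CloudFormation custom resources

    ssm = boto3.client("ssm")

    def handler(event, context):
        try:
            if event["RequestType"] in ("Create", "Update"):
                # Credentials gathered by the existing automated process
                # (hypothetical property passed in from the template).
                creds = event["ResourceProperties"]["ApiCredentials"]
                ssm.put_parameter(
                    Name="/app/bank-sources/api-credentials",  # hypothetical
                    Value=creds,
                    Type="SecureString",  # encrypted with AWS KMS
                    Overwrite=True,
                )
            cfnresponse.send(event, context, cfnresponse.SUCCESS, {})
        except Exception:
            cfnresponse.send(event, context, cfnresponse.FAILED, {})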

Reference: Creating Systems Manager parameters

A developer is configuring an application's deployment environment in AWS CodePipeline. The application code is stored in a GitHub repository. The developer wants to ensure that the repository package's unit tests run in the new deployment environment. The developer has already set the pipeline's source provider to GitHub and has specified the repository and branch to use in the deployment.

Which combination of steps should the developer take next to meet these requirements with the LEAST overhead? (Select TWO.)

A. Create an AWS CodeCommit project. Add the repository package's build and test commands to the project's buildspec.
B. Create an AWS CodeBuild project. Add the repository package's build and test commands to the project's buildspec.
C. Create an AWS CodeDeploy project. Add the repository package's build and test commands to the project's buildspec.
D. Add an action to the source stage. Specify the newly created project as the action provider. Specify the build artifact as the action's input artifact.
E. Add a new stage to the pipeline after the source stage. Add an action to the new stage. Specify the newly created project as the action provider. Specify the source artifact as the action's input artifact.
Suggested answer: B, E

Explanation:

This solution will ensure that the repository package's unit tests run in the new deployment environment with the least overhead because it uses AWS CodeBuild to build and test the code in a fully managed service, and AWS CodePipeline to orchestrate the deployment stages and actions.

Option A is not optimal because it will use AWS CodeCommit instead of AWS CodeBuild, which is a source control service, not a build and test service. Option C is not optimal because it will use AWS CodeDeploy instead of AWS CodeBuild, which is a deployment service, not a build and test service.

Option D is not optimal because it will add an action to the source stage instead of creating a new stage, which will not follow the best practice of separating different deployment phases.
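
A minimal Python (boto3) sketch of creating such a CodeBuild project with an inline buildspec, assuming a Node.js package and hypothetical project name, build image, and service role:

    import textwrap
    import boto3

    codebuild = boto3.client("codebuild")

    # The build phase installs dependencies and runs the unit tests.
    BUILDSPEC = textwrap.dedent("""\
        version: 0.2
        phases:
          build:
            commands:
              - npm install
              - npm test
    """)

    codebuild.create_project(
        name="repo-unit-tests",  # hypothetical project name
        source={"type": "CODEPIPELINE", "buildspec": BUILDSPEC},
        artifacts={"type": "CODEPIPELINE"},  # receives the source artifact
        environment={
            "type": "LINUX_CONTAINER",
            "image": "aws/codebuild/standard:7.0",
            "computeType": "BUILD_GENERAL1_SMALL",
        },
        serviceRole="arn:aws:iam::123456789012:role/codebuild-role",  # hypothetical
    )

The new pipeline stage (option E) would then point its build action at this project and pass the source artifact as the action's input.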

Reference: AWS CodeBuild, AWS CodePipeline

A developer is trying to get data from an Amazon DynamoDB table called demoman-table. The developer configured the AWS CLI to use a specific IAM user's credentials and ran the following command.

The command returned errors and no rows were returned.

What is the MOST likely cause of these issues?

A. The command is incorrect; it should be rewritten to use put-item with a string argument.
B. The developer needs to log a ticket with AWS Support to enable access to the demoman-table.
C. Amazon DynamoDB cannot be accessed from the AWS CLI and needs to be called via the REST API.
D. The IAM user needs an associated policy with read access to demoman-table.
Suggested answer: D

Explanation:

This solution will most likely solve the issues because it will grant the IAM user the necessary permission to access the DynamoDB table using the AWS CLI command. The error message indicates that the IAM user does not have sufficient access rights to perform the scan operation on the table.

Option A is not optimal because it will change the command to use put-item instead of scan, which will not achieve the desired result of getting data from the table. Option B is not optimal because it will involve contacting AWS Support, which may not be necessary or efficient for this issue. Option C is not optimal because it will state that DynamoDB cannot be accessed from the AWS CLI, which is incorrect as DynamoDB supports AWS CLI commands.
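
A minimal Python (boto3) sketch of granting that read access as an inline policy, with hypothetical user name, account ID, and Region:

    import json
    import boto3

    iam = boto3.client("iam")

    # Read-only access to demoman-table (account ID/Region are placeholders).
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["dynamodb:GetItem", "dynamodb:Query", "dynamodb:Scan"],
            "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/demoman-table",
        }],
    }

    iam.put_user_policy(
        UserName="cli-developer",  # hypothetical IAM user
        PolicyName="demoman-table-read",
        PolicyDocument=json.dumps(policy),
    )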

Reference: AWS CLI for DynamoDB, IAM Policies for DynamoDB
