
Amazon DOP-C01 Practice Test - Questions Answers, Page 42

You have been asked to handle a large data migration from multiple Amazon RDS MySQL instances to a DynamoDB table. You have been given a short amount of time to complete the data migration. What will allow you to complete this complex data processing workflow?

A. Create an Amazon Kinesis data stream, pipe in all of the Amazon RDS data, and direct the data toward a DynamoDB table.
B. Write a script in your language of choice, install the script on an Amazon EC2 instance, and then use Auto Scaling groups to ensure that the latency of the migration pipelines never exceeds four seconds in any 15-minute period.
C. Write a bash script to run on your Amazon RDS instance that will export data into DynamoDB.
D. Create a data pipeline to export Amazon RDS data and import the data into DynamoDB.
Suggested answer: D

A DevOps Engineer encountered the following error when attempting to use an AWS CloudFormation template to create an Amazon ECS cluster:

An error occurred (InsufficientCapabilitiesException) when calling the CreateStack operation.

What caused this error and what steps need to be taken to allow the Engineer to successfully execute the AWS CloudFormation template?

A. The AWS user or role attempting to execute the CloudFormation template does not have the permissions required to create the resources within the template. The Engineer must review the user policies and add any permissions needed to create the resources and then rerun the template execution.
B. The AWS CloudFormation service cannot be reached and is not capable of creating the cluster. The Engineer needs to confirm that routing and firewall rules are not preventing the AWS CloudFormation script from communicating with the AWS service endpoints, and then rerun the template execution.
C. The CloudFormation execution was not granted the capability to create IAM resources. The Engineer needs to provide CAPABILITY_IAM and CAPABILITY_NAMED_IAM as capabilities in the CloudFormation execution parameters or provide the capabilities in the AWS Management Console.
D. CloudFormation is not capable of fulfilling the request of the specified resources in the current AWS Region. The Engineer needs to specify a new region and rerun the template.
Suggested answer: C

Explanation:

Reference: https://github.com/awslabs/serverless-application-model/issues/51
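
For context, acknowledging the capabilities looks like this in a minimal boto3 sketch; the stack name and template file below are placeholders, not taken from the question:

```python
# Minimal sketch: pass IAM capabilities when creating a stack with boto3.
# "ecs-cluster" and "ecs-cluster.yaml" are placeholder names.
import boto3

cloudformation = boto3.client("cloudformation")

with open("ecs-cluster.yaml") as f:  # template that also creates IAM roles
    template_body = f.read()

cloudformation.create_stack(
    StackName="ecs-cluster",
    TemplateBody=template_body,
    # Without these, CreateStack fails with InsufficientCapabilitiesException
    # whenever the template creates IAM resources (named or unnamed).
    Capabilities=["CAPABILITY_IAM", "CAPABILITY_NAMED_IAM"],
)
```

The AWS CLI equivalent is the --capabilities flag on aws cloudformation create-stack.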

Your application's Amazon Elastic Compute Cloud (EC2) instances bootstrap by using a master configuration file that is kept in a version-enabled Amazon Simple Storage Service (S3) bucket. Which one of the following methods should you use to securely install the current configuration version onto the instances in a cost-effective way?

A. Create an Amazon DynamoDB table to store the different versions of the configuration file. Associate AWS Identity and Access Management (IAM) EC2 roles to the Amazon EC2 instances, and reference the DynamoDB table to get the latest file from Amazon Simple Storage Service (S3).
B. Associate an IAM S3 role to the bucket, list the object versions using the Amazon S3 API, and then get the latest object.
C. Associate an IAM EC2 role to the instances, list the object versions using the Amazon S3 API, and then get the latest object.
D. Associate an IAM EC2 role to the instances, and then simply get the object from Amazon S3, because the default is the current version.
E. Store the IAM credentials in the Amazon EC2 user data for each instance, and then simply get the object from S3, because the default is the current version.
Suggested answer: D
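
A minimal sketch of why D is the cheapest secure option: on an instance with an IAM role attached, a plain GetObject call already returns the current version of a versioned object, so no version listing is needed. The bucket and key names below are placeholders:

```python
# Sketch: fetch the current version of a versioned S3 object from an EC2
# instance. Credentials come from the attached instance role, not user data.
import boto3

s3 = boto3.client("s3")

# No VersionId argument: S3 returns the latest (current) version by default.
response = s3.get_object(Bucket="my-config-bucket", Key="master-config.cfg")
config = response["Body"].read()
```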

You are using a configuration management system to manage your Amazon EC2 instances. On your Amazon EC2 instances, you want to store credentials for connecting to an Amazon RDS DB instance. How should you securely store these credentials?

A. Give the Amazon EC2 instances an IAM role that allows read access to a private Amazon S3 bucket. Store a file with database credentials in the Amazon S3 bucket. Have your configuration management system pull the file from the bucket when it is needed.
B. Launch an Amazon EC2 instance and use the configuration management system to bootstrap the instance with the Amazon RDS DB credentials. Create an AMI from this instance.
C. Store the Amazon RDS DB credentials in Amazon EC2 user data. Import the credentials into the instance on boot.
D. Assign an IAM role to your Amazon RDS instance, and use this IAM role to access the Amazon RDS DB from your Amazon EC2 instances.
E. Store your credentials in your version control system, in plaintext. Check out a copy of your credentials from the version control system on boot. Use Amazon EBS encryption on the volume storing the Amazon RDS DB credentials.
Suggested answer: A
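
A hypothetical sketch of the pull step a configuration-management hook might run under option A; the bucket, key, and file layout are assumptions, not from the question:

```python
# Sketch: fetch a credentials file from a private bucket using the EC2
# instance role (no keys stored on disk or in user data).
import json
import boto3

s3 = boto3.client("s3")  # signed with the instance role's temporary credentials

obj = s3.get_object(Bucket="my-private-config-bucket", Key="rds-credentials.json")
creds = json.loads(obj["Body"].read())

db_user = creds["username"]
db_password = creds["password"]  # hand off to the DB client; never log this
```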

A company is adopting serverless computing and is migrating some of its existing applications to AWS Lambda. A DevOps engineer must come up with an automated deployment strategy using AWS CodePipeline that should include proper version controls, branching strategies, and rollback methods. Which combination of steps should the DevOps engineer follow when setting up the pipeline? (Choose three.)

A. Use Amazon S3 as the source code repository.
B. Use AWS CodeCommit as the source code repository.
C. Use AWS CloudFormation to create an AWS Serverless Application Model (AWS SAM) template for deployment.
D. Use AWS CodeBuild to create an AWS Serverless Application Model (AWS SAM) template for deployment.
E. Use AWS CloudFormation to deploy the application.
F. Use AWS CodeDeploy to deploy the application.
Suggested answer: B, C, F

A government agency has multiple AWS accounts, many of which store sensitive citizen information. A Security team wants to detect anomalous account and network activities (such as SSH brute force attacks) in any account and centralize that information in a dedicated security account. Event information should be stored in an Amazon S3 bucket in the security account, which is monitored by the department’s Security Information and Event Management (SIEM) system. How can this be accomplished?

A. Enable Amazon Macie in every account. Configure the security account as the Macie Administrator for every member account using invitation/acceptance. Create an Amazon CloudWatch Events rule in the security account to send all findings to Amazon Kinesis Data Firehose, which should push the findings to the S3 bucket.
B. Enable Amazon Macie in the security account only. Configure the security account as the Macie Administrator for every member account using invitation/acceptance. Create an Amazon CloudWatch Events rule in the security account to send all findings to Amazon Kinesis Data Streams. Write an application using KCL to read data from the Kinesis Data Streams and write to the S3 bucket.
C. Enable Amazon GuardDuty in every account. Configure the security account as the GuardDuty Administrator for every member account using invitation/acceptance. Create an Amazon CloudWatch Events rule in the security account to send all findings to Amazon Kinesis Data Firehose, which will push the findings to the S3 bucket.
D. Enable Amazon GuardDuty in the security account only. Configure the security account as the GuardDuty Administrator for every member account using invitation/acceptance. Create an Amazon CloudWatch Events rule in the security account to send all findings to Amazon Kinesis Data Streams. Write an application using KCL to read data from Kinesis Data Streams and write to the S3 bucket.
Suggested answer: C
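
A hypothetical sketch of the routing rule from option C, run in the security (GuardDuty administrator) account; the rule name, role ARN, and delivery-stream ARN are placeholders:

```python
# Sketch: route all GuardDuty findings (including aggregated member-account
# findings) to a Kinesis Data Firehose delivery stream that writes to the
# SIEM-monitored S3 bucket.
import boto3

events = boto3.client("events")

events.put_rule(
    Name="guardduty-findings-to-firehose",
    EventPattern='{"source": ["aws.guardduty"], "detail-type": ["GuardDuty Finding"]}',
    State="ENABLED",
)

events.put_targets(
    Rule="guardduty-findings-to-firehose",
    Targets=[{
        "Id": "firehose-to-siem-bucket",
        "Arn": "arn:aws:firehose:us-east-1:111122223333:deliverystream/findings",
        # CloudWatch Events needs this role to put records onto the stream.
        "RoleArn": "arn:aws:iam::111122223333:role/events-to-firehose",
    }],
)
```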

You have a high security requirement for your AWS accounts.

What is the most rapid and sophisticated setup you can use to react to AWS API calls to your account?

A. Subscribe to AWS Config via an SNS topic. Use a Lambda function to perform in-flight analysis and react to changes as they occur.
B. Global AWS CloudTrail setup delivering to S3 with an SNS subscription to the delivery notifications, pushing into a Lambda function, which inserts records into an ELK stack for analysis.
C. Use a CloudWatch Events rule with a ScheduleExpression to periodically analyze IAM credential logs. Push the deltas for events into an ELK stack and perform ad hoc analysis there.
D. CloudWatch Events rules that trigger on all AWS API calls, submitting all events to an Amazon Kinesis stream for arbitrary downstream analysis.
Suggested answer: B
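
A hypothetical Lambda handler for the pipeline in option B: triggered by the SNS topic that receives CloudTrail log-file delivery notifications, it fetches each gzipped log file from S3 and forwards the API-call records downstream. The ELK write is stubbed out, since the indexing client is not specified in the question:

```python
# Sketch: SNS-triggered Lambda that unpacks CloudTrail log files from S3.
import gzip
import json
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    for record in event["Records"]:
        # CloudTrail's SNS notification names the bucket and delivered files.
        message = json.loads(record["Sns"]["Message"])
        bucket = message["s3Bucket"]
        for key in message["s3ObjectKey"]:
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            for api_call in json.loads(gzip.decompress(body))["Records"]:
                index_into_elk(api_call)

def index_into_elk(api_call):
    print(api_call["eventName"])  # placeholder for a real Elasticsearch write
```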

A Development team uses AWS CodeCommit for source code control. Developers apply their changes to various feature branches and create pull requests to move those changes to the master branch when they are ready for production. A direct push to the master branch should not be allowed. The team applied the AWS managed policy AWSCodeCommitPowerUser to the Developers’ IAM Role, but now members are able to push to the master branch directly on every repository in the AWS account. What actions should be taken to restrict this?

A. Create an additional policy to include a deny rule for the codecommit:GitPush action, and include a restriction for the specific repositories in the resource statement with a condition for the master reference.
B. Remove the IAM policy and add an AWSCodeCommitReadOnly policy. Add an allow rule for the codecommit:GitPush action for the specific repositories in the resource statement with a condition for the master reference.
C. Modify the IAM policy and include a deny rule for the codecommit:GitPush action for the specific repositories in the resource statement with a condition for the master reference.
D. Create an additional policy to include an allow rule for the codecommit:GitPush action and include a restriction for the specific repositories in the resource statement with a condition for the feature branches reference.
Suggested answer: A

Explanation:

Reference:

https://aws.amazon.com/pt/blogs/devops/refining-access-to-branches-in-aws-codecommit/
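
A sketch of the additional policy, following the approach in the referenced blog post; the account ID, Region, repository name, and role name are placeholders:

```python
# Sketch: attach a deny statement for pushes to the master reference of a
# specific repository, without touching the existing PowerUser policy.
import json
import boto3

iam = boto3.client("iam")

deny_master_push = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Deny",
        "Action": "codecommit:GitPush",
        "Resource": "arn:aws:codecommit:us-east-1:111122223333:MyDemoRepo",
        "Condition": {
            # Deny only when the push touches refs/heads/master ...
            "StringEqualsIfExists": {"codecommit:References": ["refs/heads/master"]},
            # ... while still denying pushes that carry no reference context.
            "Null": {"codecommit:References": "false"},
        },
    }],
}

iam.put_role_policy(
    RoleName="Developers",
    PolicyName="DenyPushToMaster",
    PolicyDocument=json.dumps(deny_master_push),
)
```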

By default, Amazon CloudTrail logs ____ actions defined by the CloudTrail ____ APIs.

A. bucket-level; RESTful
B. object-level; RESTful
C. object-level; SDK
D. bucket-level; SDK
Suggested answer: A

Explanation:

By default, CloudTrail logs bucket-level actions. Amazon S3 records are written together with other AWS service records in a log file. Amazon S3 bucket-level actions supported for logging by CloudTrail are defined in its RESTful API.

Reference: http://docs.aws.amazon.com/AmazonS3/latest/dev/cloudtrail-logging.html
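
Object-level S3 activity, by contrast, must be opted into as data events. A minimal sketch of that opt-in, with placeholder trail and bucket names:

```python
# Sketch: enable object-level (data event) logging for one bucket on an
# existing trail; bucket-level management events remain logged by default.
import boto3

cloudtrail = boto3.client("cloudtrail")

cloudtrail.put_event_selectors(
    TrailName="my-trail",
    EventSelectors=[{
        "ReadWriteType": "All",
        "IncludeManagementEvents": True,  # the bucket-level default behavior
        "DataResources": [{
            "Type": "AWS::S3::Object",
            # Log GetObject/PutObject/DeleteObject for this bucket's objects.
            "Values": ["arn:aws:s3:::my-bucket/"],
        }],
    }],
)
```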

A company is using AWS CodeDeploy to manage its application deployments. Recently, the Development team decided to use GitHub for version control, and the team is looking for ways to integrate the GitHub repository with CodeDeploy. The team also needs to develop a way to automate deployment whenever there is a new commit on that repository. How can the integration be achieved in the MOST efficient way?

A. Create a GitHub webhook to replicate the repository to AWS CodeCommit. Create an AWS CodePipeline pipeline that uses CodeCommit as a source provider and AWS CodeDeploy as a deployment provider. Once configured, commit a change to the GitHub repository to start the first deployment.
B. Create an AWS CodePipeline pipeline that uses GitHub as a source provider and AWS CodeDeploy as a deployment provider. Connect this new pipeline with the GitHub account and instruct CodePipeline to use webhooks in GitHub to automatically start the pipeline when a change occurs.
C. Create an AWS Lambda function to check periodically if there has been a new commit within the GitHub repository. If a new commit is found, trigger a CreateDeployment API call to AWS CodeDeploy to start a new deployment based on the last commit ID within the deployment group.
D. Create an AWS CodeDeploy custom deployment configuration to associate the GitHub repository with the deployment group. During the association process, authenticate the deployment group with GitHub to obtain the GitHub security authentication token. Configure the deployment group options to automatically deploy if a new commit is found. Perform a new commit to the GitHub repository to trigger the first deployment.
Suggested answer: B
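
A hypothetical sketch of the webhook registration from option B, assuming the pipeline's source stage already uses the GitHub (OAuth-based) provider; all names and the secret token are placeholders:

```python
# Sketch: register a webhook so a push to the tracked branch starts the
# pipeline, then create the matching webhook on the GitHub repository.
import boto3

codepipeline = boto3.client("codepipeline")

codepipeline.put_webhook(
    webhook={
        "name": "github-push-webhook",
        "targetPipeline": "app-deploy-pipeline",
        "targetAction": "Source",  # the GitHub source action's name
        "filters": [{
            "jsonPath": "$.ref",
            "matchEquals": "refs/heads/{Branch}",  # resolved from the source config
        }],
        "authentication": "GITHUB_HMAC",
        "authenticationConfiguration": {"SecretToken": "replace-with-a-secret"},
    }
)

codepipeline.register_webhook_with_third_party(webhookName="github-push-webhook")
```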