ExamGecko

Amazon DOP-C01 Practice Test - Questions Answers, Page 42

Question 411

You have been asked to handle a large data migration from multiple Amazon RDS MySQL instances to a DynamoDB table. You have been given a short amount of time to complete the data migration. What will allow you to complete this complex data processing workflow?

A. Create an Amazon Kinesis data stream, pipe in all of the Amazon RDS data, and direct the data toward a DynamoDB table.
B. Write a script in your language of choice, install the script on an Amazon EC2 instance, and then use Auto Scaling groups to ensure that the latency of the migration pipelines never exceeds four seconds in any 15-minute period.
C. Write a bash script to run on your Amazon RDS instance that will export data into DynamoDB.
D. Create a data pipeline to export Amazon RDS data and import the data into DynamoDB.
Suggested answer: D
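Option D refers to AWS Data Pipeline, which provides managed, scheduled export/import between RDS and DynamoDB without building custom streaming or scripting infrastructure. The sketch below shows the general shape of a pipeline-object list you would pass to `put_pipeline_definition`; the object types follow Data Pipeline's definition syntax, but the specific fields, IDs, and table names are illustrative, not a complete working pipeline.

```python
# Sketch of a Data Pipeline definition that reads an RDS MySQL table and
# writes it to DynamoDB. Field values and IDs here are placeholders.

def build_pipeline_objects(rds_table, dynamodb_table):
    """Return the pipeline-object list to pass to put_pipeline_definition
    (via boto3's datapipeline client or the AWS CLI)."""
    return [
        {
            "id": "RdsSource",
            "name": "RdsSource",
            "fields": [
                {"key": "type", "stringValue": "SqlDataNode"},
                {"key": "table", "stringValue": rds_table},
                {"key": "selectQuery", "stringValue": f"SELECT * FROM {rds_table}"},
            ],
        },
        {
            "id": "DynamoTarget",
            "name": "DynamoTarget",
            "fields": [
                {"key": "type", "stringValue": "DynamoDBDataNode"},
                {"key": "tableName", "stringValue": dynamodb_table},
            ],
        },
        {
            "id": "CopyToDynamo",
            "name": "CopyToDynamo",
            "fields": [
                {"key": "type", "stringValue": "HiveCopyActivity"},
                {"key": "input", "refValue": "RdsSource"},
                {"key": "output", "refValue": "DynamoTarget"},
            ],
        },
    ]

objects = build_pipeline_objects("customers", "CustomersMigrated")
print([o["id"] for o in objects])
```

Because Data Pipeline handles scheduling, retries, and the EMR resources behind the copy activity, it suits a one-off bulk migration better than the Kinesis or hand-rolled-script options.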

Question 412

A DevOps Engineer encountered the following error when attempting to use an AWS CloudFormation template to create an Amazon ECS cluster:

An error occurred (InsufficientCapabilitiesException) when calling the CreateStack operation.

What caused this error and what steps need to be taken to allow the Engineer to successfully execute the AWS CloudFormation template?

A. The AWS user or role attempting to execute the CloudFormation template does not have the permissions required to create the resources within the template. The Engineer must review the user policies and add any permissions needed to create the resources and then rerun the template execution.
B. The AWS CloudFormation service cannot be reached and is not capable of creating the cluster. The Engineer needs to confirm that routing and firewall rules are not preventing the AWS CloudFormation script from communicating with the AWS service endpoints, and then rerun the template execution.
C. The CloudFormation execution was not granted the capability to create IAM resources. The Engineer needs to provide CAPABILITY_IAM and CAPABILITY_NAMED_IAM as capabilities in the CloudFormation execution parameters or provide the capabilities in the AWS Management Console.
D. CloudFormation is not capable of fulfilling the request of the specified resources in the current AWS Region. The Engineer needs to specify a new region and rerun the template.
Suggested answer: C

Explanation:

Reference: https://github.com/awslabs/serverless-application-model/issues/51
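In practice, the fix in option C is just a matter of acknowledging the IAM capabilities on the CreateStack call. A minimal sketch of the request arguments (the stack name and template body are placeholders):

```python
# Sketch: acknowledging IAM capabilities on a CreateStack request.
# Without the Capabilities list, a template that creates IAM resources
# fails with InsufficientCapabilitiesException.

def create_stack_args(stack_name, template_body):
    """Build the keyword arguments for boto3's cloudformation
    create_stack call (equivalent to the --capabilities flag on
    `aws cloudformation create-stack`)."""
    return {
        "StackName": stack_name,
        "TemplateBody": template_body,
        # Acknowledge that the stack may create (named) IAM resources.
        "Capabilities": ["CAPABILITY_IAM", "CAPABILITY_NAMED_IAM"],
    }

args = create_stack_args("ecs-cluster", "...template body...")
print(args["Capabilities"])
```

CAPABILITY_NAMED_IAM is required when the template assigns custom names to IAM resources; CAPABILITY_IAM covers unnamed ones.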


Question 413

Your application's Amazon Elastic Compute Cloud (EC2) instances bootstrap by using a master configuration file kept in a version-enabled Amazon Simple Storage Service (S3) bucket. Which one of the following methods should you use to securely install the current configuration version onto the instances in a cost-effective way?

A. Create an Amazon DynamoDB table to store the different versions of the configuration file. Associate AWS Identity and Access Management (IAM) EC2 roles to the Amazon EC2 instances, and reference the DynamoDB table to get the latest file from Amazon Simple Storage Service (S3).
B. Associate an IAM S3 role to the bucket, list the object versions using the Amazon S3 API, and then get the latest object.
C. Associate an IAM EC2 role to the instances, list the object versions using the Amazon S3 API, and then get the latest object.
D. Associate an IAM EC2 role to the instances, and then simply get the object from Amazon S3, because the default is the current version.
E. Store the IAM credentials in the Amazon EC2 user data for each instance, and then simply get the object from S3, because the default is the current version.
Suggested answer: D
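Option D works because, in a version-enabled bucket, a GetObject call that omits a VersionId always returns the current version, so no listing step is needed. The sketch below mimics what a ListObjectVersions-style response would tell you, just to show that the list-then-fetch approaches (B and C) resolve to the same object a plain GetObject already returns; the version IDs are made up.

```python
# Sketch: the version that ListObjectVersions marks IsLatest is exactly
# what a plain get_object(Bucket=..., Key=...) returns by default.

def current_version_id(versions):
    """Pick the version flagged IsLatest from a
    ListObjectVersions-style response."""
    for v in versions:
        if v["IsLatest"]:
            return v["VersionId"]
    return None

versions = [
    {"VersionId": "v3", "IsLatest": True},
    {"VersionId": "v2", "IsLatest": False},
    {"VersionId": "v1", "IsLatest": False},
]

# Listing then fetching lands on the same object that the default
# GetObject request already serves -- extra API calls, no extra safety.
print(current_version_id(versions))
```

Combined with an instance role (no stored credentials), the default GetObject is both the secure and the cheapest path.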

Question 414

You are using a configuration management system to manage your Amazon EC2 instances. On your Amazon EC2 Instances, you want to store credentials for connecting to an Amazon RDS DB instance. How should you securely store these credentials?

A. Give the Amazon EC2 instances an IAM role that allows read access to a private Amazon S3 bucket. Store a file with database credentials in the Amazon S3 bucket. Have your configuration management system pull the file from the bucket when it is needed.
B. Launch an Amazon EC2 instance and use the configuration management system to bootstrap the instance with the Amazon RDS DB credentials. Create an AMI from this instance.
C. Store the Amazon RDS DB credentials in Amazon EC2 user data. Import the credentials into the instance on boot.
D. Assign an IAM role to your Amazon RDS instance, and use this IAM role to access the Amazon RDS DB from your Amazon EC2 instances.
E. Store your credentials in your version control system, in plaintext. Check out a copy of your credentials from the version control system on boot. Use Amazon EBS encryption on the volume storing the Amazon RDS DB credentials.
Suggested answer: A
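Option A hinges on attaching a role policy that grants the instances read-only access to the private credentials bucket. A minimal sketch of such a policy follows; the bucket name and key prefix are placeholders.

```python
import json

# Sketch: read-only policy for a private credentials bucket, attached to
# the EC2 instance role (option A). Bucket name and prefix are placeholders.
CREDENTIALS_BUCKET = "example-config-bucket"

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": f"arn:aws:s3:::{CREDENTIALS_BUCKET}/credentials/*",
        }
    ],
}

# The configuration management system pulls the file using the role's
# temporary credentials -- nothing is baked into AMIs, user data, or
# version control.
print(json.dumps(policy, indent=2))
```

This keeps secrets out of AMIs (B), user data (C), and source control (E), and avoids the nonexistent pattern in D (IAM roles are not assigned to RDS instances for client access).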

Question 415

A company is adopting serverless computing and is migrating some of its existing applications to AWS Lambda. A DevOps engineer must come up with an automated deployment strategy using AWS CodePipeline that should include proper version controls, branching strategies, and rollback methods. Which combination of steps should the DevOps engineer follow when setting up the pipeline? (Choose three.)

A. Use Amazon S3 as the source code repository.
B. Use AWS CodeCommit as the source code repository.
C. Use AWS CloudFormation to create an AWS Serverless Application Model (AWS SAM) template for deployment.
D. Use AWS CodeBuild to create an AWS Serverless Application Model (AWS SAM) template for deployment.
E. Use AWS CloudFormation to deploy the application.
F. Use AWS CodeDeploy to deploy the application.
Suggested answer: B, C, F
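The pieces fit together in the SAM template itself: a versioned Lambda alias plus a deployment preference is what lets CodeDeploy shift traffic gradually and roll back automatically. A minimal sketch (the function name, handler, and code path are placeholders):

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  MyFunction:                        # placeholder name
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.handler           # placeholder handler
      Runtime: python3.9
      CodeUri: ./src
      AutoPublishAlias: live         # publishes a version and alias per deploy
      DeploymentPreference:
        Type: Canary10Percent5Minutes  # CodeDeploy shifts traffic, rolls back on alarms
```

With the source in CodeCommit and this template driving the deploy stage, the pipeline gets version control, branching, and rollback without any custom tooling.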

Question 416

A government agency has multiple AWS accounts, many of which store sensitive citizen information. A Security team wants to detect anomalous account and network activities (such as SSH brute force attacks) in any account and centralize that information in a dedicated security account. Event information should be stored in an Amazon S3 bucket in the security account, which is monitored by the department’s Security Information and Event Management (SIEM) system. How can this be accomplished?

A. Enable Amazon Macie in every account. Configure the security account as the Macie Administrator for every member account using invitation/acceptance. Create an Amazon CloudWatch Events rule in the security account to send all findings to Amazon Kinesis Data Firehose, which should push the findings to the S3 bucket.
B. Enable Amazon Macie in the security account only. Configure the security account as the Macie Administrator for every member account using invitation/acceptance. Create an Amazon CloudWatch Events rule in the security account to send all findings to Amazon Kinesis Data Streams. Write an application using KCL to read data from the Kinesis Data Streams and write to the S3 bucket.
C. Enable Amazon GuardDuty in every account. Configure the security account as the GuardDuty Administrator for every member account using invitation/acceptance. Create an Amazon CloudWatch rule in the security account to send all findings to Amazon Kinesis Data Firehose, which will push the findings to the S3 bucket.
D. Enable Amazon GuardDuty in the security account only. Configure the security account as the GuardDuty Administrator for every member account using invitation/acceptance. Create an Amazon CloudWatch rule in the security account to send all findings to Amazon Kinesis Data Streams. Write an application using KCL to read data from Kinesis Data Streams and write to the S3 bucket.
Suggested answer: C
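The routing step in option C comes down to a CloudWatch Events rule in the security account that matches GuardDuty findings and targets a Firehose delivery stream. A sketch of the event pattern (rule name and delivery-stream ARN are placeholders):

```python
import json

# Sketch: CloudWatch Events pattern that matches GuardDuty findings in
# the security (administrator) account, so they can be routed to a
# Kinesis Data Firehose target that delivers to the SIEM's S3 bucket.
event_pattern = {
    "source": ["aws.guardduty"],
    "detail-type": ["GuardDuty Finding"],
}

# With boto3 this would be applied roughly as (names are placeholders):
#   events.put_rule(Name="guardduty-findings",
#                   EventPattern=json.dumps(event_pattern))
#   events.put_targets(Rule="guardduty-findings",
#                      Targets=[{"Id": "firehose",
#                                "Arn": "arn:aws:firehose:...:deliverystream/..."}])
print(json.dumps(event_pattern))
```

Firehose delivers straight to S3 with no consumer application to run, which is why option C beats the KCL-based alternatives for this use case.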

Question 417

You have a high security requirement for your AWS accounts.

What is the most rapid and sophisticated setup you can use to react to AWS API calls to your account?

Question 418

A Development team uses AWS CodeCommit for source code control. Developers apply their changes to various feature branches and create pull requests to move those changes to the master branch when they are ready for production. A direct push to the master branch should not be allowed. The team applied the AWS managed policy AWSCodeCommitPowerUser to the Developers' IAM role, but now members are able to push to the master branch directly on every repository in the AWS account. What actions should be taken to restrict this?

Question 419

By default, Amazon CloudTrail logs ____ actions defined by the CloudTrail ____ APIs.

Question 420

A company is using AWS CodeDeploy to manage its application deployments. Recently, the Development team decided to use GitHub for version control, and the team is looking for ways to integrate the GitHub repository with CodeDeploy. The team also needs to develop a way to automate deployment whenever there is a new commit on that repository. How can the integration be achieved in the MOST efficient way?

Total 557 questions