Amazon CLF-C02 Practice Test - Questions Answers, Page 52

A company website is experiencing DDoS attacks.

Which AWS service can help protect the company website against these attacks?

A. AWS Resource Access Manager
B. AWS Amplify
C. AWS Shield
D. Amazon GuardDuty
Suggested answer: C

Explanation:

AWS Shield is a managed DDoS protection service that safeguards applications running on AWS from distributed denial of service (DDoS) attacks. DDoS attacks are malicious attempts to disrupt the normal functioning of a website or application by overwhelming it with a large volume of traffic from multiple sources. AWS Shield provides two tiers of protection: Standard and Advanced. AWS Shield Standard is automatically enabled for all AWS customers at no additional cost. It protects your AWS resources, such as Amazon CloudFront, AWS Global Accelerator, and Amazon Route 53, from the most common and frequently occurring network and transport layer DDoS attacks. AWS Shield Advanced is an optional paid service that provides expanded protection for internet-facing resources such as Amazon CloudFront distributions, Amazon Route 53 hosted zones, AWS Global Accelerator accelerators, Elastic Load Balancing (ELB) load balancers, and Amazon EC2 Elastic IP addresses. AWS Shield Advanced offers enhanced detection and mitigation capabilities, 24/7 access to the AWS DDoS Response Team (DRT), real-time visibility and reporting, and cost protection against DDoS-related spikes in your AWS bill.
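
For illustration only, the sketch below shows how a resource could be enrolled in Shield Advanced programmatically with boto3, assuming the account already has an active Shield Advanced subscription; the CloudFront distribution ARN is a placeholder.

# Hypothetical sketch: add AWS Shield Advanced protection to a CloudFront
# distribution (assumes an active Shield Advanced subscription).
import boto3

shield = boto3.client("shield", region_name="us-east-1")  # Shield API endpoint is in us-east-1

# Placeholder ARN for illustration only.
distribution_arn = "arn:aws:cloudfront::123456789012:distribution/EDFDVBD6EXAMPLE"

response = shield.create_protection(
    Name="website-ddos-protection",
    ResourceArn=distribution_arn,
)
print("Protection ID:", response["ProtectionId"])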

A company wants a customized assessment of its current on-premises environment. The company wants to understand its projected running costs in the AWS Cloud.

Which AWS service or tool will meet these requirements?

A. AWS Trusted Advisor
B. Amazon Inspector
C. AWS Control Tower
D. Migration Evaluator
Suggested answer: D

Explanation:

Migration Evaluator is an AWS service that provides a customized assessment of your current on-premises environment and helps you build a data-driven business case for migration to AWS. Migration Evaluator collects and analyzes data from your on-premises servers, such as CPU, memory, disk, network, and utilization metrics, and compares them with the most cost-effective AWS alternatives. Migration Evaluator also helps you understand your existing software licenses and running costs, and provides recommendations for Bring Your Own License (BYOL) and License Included (LI) options in AWS. Migration Evaluator generates a detailed report that shows your projected running costs in the AWS Cloud, along with potential savings and benefits. You can use this report to support your decision-making and planning for cloud migration. Reference: Cloud Business Case & Migration Plan - Amazon Migration Evaluator - AWS, Getting started with Migration Evaluator

A company that has multiple business units wants to centrally manage and govern its AWS Cloud environments. The company wants to automate the creation of AWS accounts, apply service control policies (SCPs), and simplify billing processes.

Which AWS service or tool should the company use to meet these requirements?

A. AWS Organizations
B. Cost Explorer
C. AWS Budgets
D. AWS Trusted Advisor
Suggested answer: A

Explanation:

AWS Organizations is an AWS service that enables you to centrally manage and govern your AWS Cloud environments across multiple business units. AWS Organizations allows you to create an organization that consists of AWS accounts that you create or invite to join. You can group your accounts into organizational units (OUs) and apply service control policies (SCPs) to them. SCPs are a type of policy that specifies the maximum permissions for the accounts in your organization, and they can help you enforce compliance and security requirements. AWS Organizations also simplifies billing processes by enabling you to consolidate and pay for all member accounts with a single payment method. You can also use AWS Organizations to automate the creation of AWS accounts by using APIs or AWS CloudFormation templates. Reference: What is AWS Organizations?, Policy-Based Management - AWS Organizations
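
As a minimal sketch of this workflow (assuming the caller is the organization's management account with the required permissions), the boto3 snippet below creates a member account, defines a simple SCP, and attaches it to an OU. The email address, policy content, and OU ID are placeholders.

# Hypothetical sketch: create a member account and attach a service control
# policy (SCP) with the AWS Organizations API. IDs and emails are placeholders.
import json
import boto3

org = boto3.client("organizations")

# Kick off asynchronous account creation.
create = org.create_account(
    Email="dev-team@example.com",
    AccountName="dev-account",
)
print("Create account request state:", create["CreateAccountStatus"]["State"])

# Define an SCP that denies leaving the organization, then attach it to an OU.
scp_document = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Deny", "Action": "organizations:LeaveOrganization", "Resource": "*"}
    ],
}
policy = org.create_policy(
    Content=json.dumps(scp_document),
    Description="Prevent member accounts from leaving the organization",
    Name="DenyLeaveOrganization",
    Type="SERVICE_CONTROL_POLICY",
)
org.attach_policy(
    PolicyId=policy["Policy"]["PolicySummary"]["Id"],
    TargetId="ou-examplerootid-exampleouid",  # placeholder OU ID
)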

According to security best practices, how should an Amazon EC2 instance be given access to an Amazon S3 bucket?

A. Hard code an IAM user's secret key and access key directly in the application, and upload the file.
B. Store the IAM user's secret key and access key in a text file on the EC2 instance, read the keys, then upload the file.
C. Have the EC2 instance assume a role to obtain the privileges to upload the file.
D. Modify the S3 bucket policy so that any service can upload to it at any time.
Suggested answer: C

Explanation:

According to security best practices, the best way to give an Amazon EC2 instance access to an Amazon S3 bucket is to have the EC2 instance assume a role to obtain the privileges to upload the file. A role is an AWS Identity and Access Management (IAM) entity that defines a set of permissions for making AWS service requests. You can use roles to delegate access to users, applications, or services that don't normally have access to your AWS resources. For example, you can create a role that allows EC2 instances to access S3 buckets, and then attach the role to the EC2 instance. This way, the EC2 instance can assume the role and obtain temporary security credentials to access the S3 bucket. This method is more secure and scalable than storing or hardcoding IAM user credentials on the EC2 instance, as it avoids the risk of exposing or compromising the credentials. It also allows you to manage the permissions centrally and dynamically, and to audit the access using AWS CloudTrail. For more information on how to create and use roles for EC2 instances, see Using an IAM role to grant permissions to applications running on Amazon EC2 instances.

The other options are not recommended for security reasons. Hardcoding or storing IAM user credentials on the EC2 instance is a bad practice, as it exposes the credentials to potential attackers or unauthorized users who can access the instance or the application code. It also makes it difficult to rotate or revoke the credentials, and to track the usage of the credentials. Modifying the S3 bucket policy to allow any service to upload to it at any time is also a bad practice, as it opens the bucket to potential data breaches, data loss, or data corruption. It also violates the principle of least privilege, which states that you should grant only the minimum permissions necessary for a task.
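
A minimal sketch of the recommended approach is shown below: application code running on an EC2 instance whose attached IAM role (instance profile) grants s3:PutObject. No access keys appear in the code, because boto3 obtains temporary credentials from the instance metadata service automatically. The bucket name and file paths are placeholders.

# Hypothetical sketch: code running on an EC2 instance that has an IAM role
# (instance profile) granting s3:PutObject. No access keys are embedded;
# boto3 retrieves temporary credentials from the instance metadata service.
import boto3

s3 = boto3.client("s3")

# Placeholder bucket and object names for illustration only.
s3.upload_file(
    Filename="/var/app/reports/daily-report.csv",
    Bucket="example-company-reports",
    Key="reports/daily-report.csv",
)
print("Upload complete using role-based temporary credentials.")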

What is the purpose of having an internet gateway within a VPC?

A. To create a VPN connection to the VPC
B. To allow communication between the VPC and the internet
C. To impose bandwidth constraints on internet traffic
D. To load balance traffic from the internet across Amazon EC2 instances
Suggested answer: B

Explanation:

An internet gateway is the component that allows internet traffic to flow into and out of a VPC; without one, the VPC is isolated and can typically be reached only through a VPN connection rather than over the internet. An internet gateway is a logical connection between an AWS VPC and the internet. It supports IPv4 and IPv6 traffic, and it does not cause availability risks or bandwidth constraints on your network traffic. An internet gateway enables resources in your public subnets (such as EC2 instances) to connect to the internet if the resource has a public IPv4 address or an IPv6 address. Similarly, resources on the internet can initiate a connection to resources in your subnet using the public IPv4 address or IPv6 address. An internet gateway also provides a target in your VPC route tables for internet-routable traffic. For communication using IPv4, the internet gateway also performs network address translation (NAT). For communication using IPv6, NAT is not needed because IPv6 addresses are public. To enable access to or from the internet for instances in a subnet in a VPC using an internet gateway, you must create an internet gateway and attach it to your VPC, add a route to your subnet's route table that directs internet-bound traffic to the internet gateway, ensure that instances in your subnet have a public IPv4 address or an IPv6 address, and ensure that your network access control lists and security group rules allow the desired internet traffic to flow to and from your instance. Reference: Connect to the internet using an internet gateway, AWS Internet Gateway and VPC Routing
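
As a minimal sketch of these steps with boto3 (the VPC and route table IDs are placeholders), the snippet below creates an internet gateway, attaches it to a VPC, and adds a default IPv4 route pointing at it.

# Hypothetical sketch: create an internet gateway, attach it to a VPC, and
# add a default route so a public subnet can reach the internet.
# The VPC and route table IDs are placeholders.
import boto3

ec2 = boto3.client("ec2")

igw_id = ec2.create_internet_gateway()["InternetGateway"]["InternetGatewayId"]
ec2.attach_internet_gateway(InternetGatewayId=igw_id, VpcId="vpc-0123456789abcdef0")

# Route all internet-bound IPv4 traffic from the public subnet's route table
# through the internet gateway.
ec2.create_route(
    RouteTableId="rtb-0123456789abcdef0",
    DestinationCidrBlock="0.0.0.0/0",
    GatewayId=igw_id,
)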

A company is hosting an application in the AWS Cloud. The company wants to verify that underlying AWS services and general AWS infrastructure are operating normally.

Which combination of AWS services can the company use to gather the required information? (Select TWO.)

A. AWS Personal Health Dashboard
B. AWS Systems Manager
C. AWS Trusted Advisor
D. AWS Service Health Dashboard
E. AWS Service Catalog
Suggested answer: A, D

Explanation:

AWS Personal Health Dashboard and AWS Service Health Dashboard are two AWS services that can help the company to verify that underlying AWS services and general AWS infrastructure are operating normally. AWS Personal Health Dashboard provides a personalized view into the performance and availability of the AWS services you are using, as well as alerts that are automatically triggered by changes in the health of those services. In addition to event-based alerts, Personal Health Dashboard provides proactive notifications of scheduled activities, such as any changes to the infrastructure powering your resources, enabling you to better plan for events that may affect you. These notifications can be delivered to you via email or mobile for quick visibility, and can always be viewed from within the AWS Management Console. When you get an alert, it includes detailed information and guidance, enabling you to take immediate action to address AWS events impacting your resources. AWS Service Health Dashboard provides a general status of AWS services, and the Service health view displays the current and historical status of all AWS services. This page shows reported service events for services across AWS Regions. You don't need to sign in or have an AWS account to access the AWS Service Health Dashboard -- Service health page. You can also subscribe to RSS feeds for specific services or regions to receive notifications about service events. Reference: Getting started with your AWS Health Dashboard -- Your account health, Introducing AWS Personal Health Dashboard
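
For account-specific health data, the information behind the Personal Health Dashboard can also be queried through the AWS Health API. A minimal sketch is shown below; note that calling the AWS Health API requires a Business or Enterprise Support plan, so treat this as optional illustration rather than part of the answer.

# Hypothetical sketch: query the AWS Health API (the data behind the
# account-specific health dashboard) for recent open issues.
# Note: the AWS Health API requires a Business or Enterprise Support plan.
import boto3

health = boto3.client("health", region_name="us-east-1")  # global endpoint

events = health.describe_events(
    filter={
        "eventTypeCategories": ["issue"],
        "eventStatusCodes": ["open", "upcoming"],
    }
)
for event in events["events"]:
    print(event["service"], event["eventTypeCode"], event["statusCode"])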

A company needs to migrate a PostgreSQL database from on-premises to Amazon RDS.

Which AWS service or tool should the company use to meet this requirement?

A. Cloud Adoption Readiness Tool
B. AWS Migration Hub
C. AWS Database Migration Service (AWS DMS)
D. AWS Application Migration Service
Suggested answer: C

Explanation:

AWS Database Migration Service (AWS DMS) is a managed and automated service that helps you migrate your databases from your on-premises or cloud environment to AWS, either as a one-time migration or as a continuous replication. AWS DMS supports migration between 20-plus database and analytics engines, such as PostgreSQL, Oracle, MySQL, SQL Server, MongoDB, Amazon Aurora, Amazon RDS, Amazon Redshift, and Amazon S3. AWS DMS also provides schema conversion and validation tools, as well as monitoring and security features. AWS DMS is a cost-effective and reliable solution for database migration, as you only pay for the compute resources and additional log storage used during the migration process, and you can minimize the downtime and data loss with Multi-AZ and ongoing replication.

To migrate a PostgreSQL database from on-premises to Amazon RDS using AWS DMS, you need to perform the following steps:

Create an AWS DMS replication instance in the same AWS Region as your target Amazon RDS PostgreSQL DB instance. The replication instance is a server that runs the AWS DMS replication software and connects to your source and target endpoints. You can choose the instance type, storage, and network settings based on your migration requirements.

Create a source endpoint that points to your on-premises PostgreSQL database. You need to provide the connection details, such as the server name, port, database name, user name, and password. You also need to specify the engine name as postgres and the SSL mode as required.

Create a target endpoint that points to your Amazon RDS PostgreSQL DB instance. You need to provide the connection details, such as the server name, port, database name, user name, and password. You also need to specify the engine name as postgres and the SSL mode as verify-full.

Create a migration task that defines the migration settings and options, such as the replication instance, the source and target endpoints, the migration type (full load, full load and change data capture, or change data capture only), the table mappings, the task settings, and the task monitoring role. You can also use the AWS Schema Conversion Tool (AWS SCT) to convert your source schema to the target schema and apply it to the target endpoint before or after creating the migration task.

Start the migration task and monitor its progress and status using the AWS DMS console, the AWS CLI, or the AWS DMS API. You can also use AWS CloudFormation to automate the creation and execution of the migration task.
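
A condensed boto3 sketch of these steps is shown below. All identifiers, hostnames, and credentials are placeholders; real code would wait for each resource to become available before moving on, and the table mappings and task settings would need to be tailored to the actual schema.

# Hypothetical, condensed sketch of the DMS steps above.
# All identifiers, hostnames, and credentials are placeholders.
import json
import boto3

dms = boto3.client("dms")

# 1. Replication instance (in practice, wait until it is "available").
ri = dms.create_replication_instance(
    ReplicationInstanceIdentifier="pg-migration-instance",
    ReplicationInstanceClass="dms.t3.medium",
    AllocatedStorage=100,
)["ReplicationInstance"]

# 2. Source endpoint: the on-premises PostgreSQL database.
source = dms.create_endpoint(
    EndpointIdentifier="onprem-postgres",
    EndpointType="source",
    EngineName="postgres",
    ServerName="onprem-db.example.internal",
    Port=5432,
    DatabaseName="appdb",
    Username="migration_user",
    Password="example-password",
    SslMode="require",
)["Endpoint"]

# 3. Target endpoint: the Amazon RDS for PostgreSQL DB instance.
target = dms.create_endpoint(
    EndpointIdentifier="rds-postgres",
    EndpointType="target",
    EngineName="postgres",
    ServerName="appdb.abcdefghij.us-east-1.rds.amazonaws.com",
    Port=5432,
    DatabaseName="appdb",
    Username="migration_user",
    Password="example-password",
    SslMode="verify-full",
)["Endpoint"]

# 4. Migration task: full load of every table in the public schema.
table_mappings = {
    "rules": [
        {
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-public-schema",
            "object-locator": {"schema-name": "public", "table-name": "%"},
            "rule-action": "include",
        }
    ]
}
task = dms.create_replication_task(
    ReplicationTaskIdentifier="pg-full-load",
    SourceEndpointArn=source["EndpointArn"],
    TargetEndpointArn=target["EndpointArn"],
    ReplicationInstanceArn=ri["ReplicationInstanceArn"],
    MigrationType="full-load",
    TableMappings=json.dumps(table_mappings),
)["ReplicationTask"]

# 5. Start the task; monitor it in the DMS console or with describe calls.
dms.start_replication_task(
    ReplicationTaskArn=task["ReplicationTaskArn"],
    StartReplicationTaskType="start-replication",
)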

The other options are not suitable for migrating a PostgreSQL database from on-premises to Amazon RDS. Cloud Adoption Readiness Tool is a tool that helps you assess your readiness for cloud adoption based on six dimensions: business, people, process, platform, operations, and security. It does not perform any database migration tasks. AWS Migration Hub is a service that helps you track and manage the progress of your application migrations across multiple AWS and partner services, such as AWS DMS, AWS Application Migration Service, AWS Server Migration Service, and CloudEndure Migration. It does not perform any database migration tasks itself, but rather integrates with other migration services. AWS Application Migration Service is a service that helps you migrate your applications from your on-premises or cloud environment to AWS without making any changes to the applications, their architecture, or the migrated servers. It does not support database migration, but rather replicates your servers as Amazon Machine Images (AMIs) and launches them as EC2 instances on AWS.

Which cloud concept is demonstrated by using AWS Compute Optimizer?

A. Security validation
B. Rightsizing
C. Elasticity
D. Global reach
Suggested answer: B

Explanation:

Rightsizing is the cloud concept that is demonstrated by using AWS Compute Optimizer. Rightsizing is the process of adjusting the type and size of your cloud resources to match the optimal performance and cost for your workloads. AWS Compute Optimizer is a service that analyzes the configuration and utilization metrics of your AWS resources, such as Amazon EC2 instances, Amazon EBS volumes, AWS Lambda functions, and Amazon ECS services on AWS Fargate. It reports whether your resources are optimal, and generates optimization recommendations to reduce the cost and improve the performance of your workloads. AWS Compute Optimizer uses machine learning to analyze your historical utilization data and compare it with the most cost-effective AWS alternatives. You can use the recommendations to evaluate the trade-offs between cost and performance, and decide when to move or resize your resources to achieve the best results. Reference: Workload Rightsizing - AWS Compute Optimizer - AWS, What is AWS Compute Optimizer? - AWS Compute Optimizer
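
As a minimal sketch (assuming the account has already opted in to AWS Compute Optimizer), the snippet below retrieves EC2 rightsizing findings and the suggested alternative instance types.

# Hypothetical sketch: retrieve EC2 rightsizing recommendations from
# AWS Compute Optimizer (assumes the account has opted in to the service).
import boto3

optimizer = boto3.client("compute-optimizer")

recs = optimizer.get_ec2_instance_recommendations()
for rec in recs["instanceRecommendations"]:
    current_type = rec["currentInstanceType"]
    finding = rec["finding"]  # e.g. Overprovisioned, Underprovisioned, or Optimized
    options = [o["instanceType"] for o in rec.get("recommendationOptions", [])]
    print(f"{rec['instanceArn']}: {current_type} is {finding}; consider {options}")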

A company wants to migrate its on-premises relational databases to the AWS Cloud. The company wants to use infrastructure as close to its current geographical location as possible.

Which AWS service or resource should the company use to select its Amazon RDS deployment area?

A. Amazon Connect
B. AWS Wavelength
C. AWS Regions
D. AWS Direct Connect
Suggested answer: C

Explanation:

AWS Regions are the AWS service or resource that the company should use to select its Amazon RDS deployment area. AWS Regions are separate geographic areas where AWS clusters its data centers. Each AWS Region consists of multiple, isolated, and physically separate Availability Zones within a geographic area. Each AWS Region is designed to be isolated from the other AWS Regions to achieve the highest possible fault tolerance and stability. AWS provides a more extensive global footprint than any other cloud provider, and to support its global footprint and ensure customers are served across the world, AWS opens new Regions rapidly. AWS maintains multiple geographic Regions, including Regions in North America, South America, Europe, China, Asia Pacific, South Africa, and the Middle East. Amazon RDS is available in several AWS Regions worldwide. To create or work with an Amazon RDS DB instance in a specific AWS Region, you must use the corresponding regional service endpoint. You can choose the AWS Region that meets your latency or legal requirements. You can also use multiple AWS Regions to design a disaster recovery solution or to distribute your read workload. Reference: Global Infrastructure Regions & AZs - aws.amazon.com, Regions, Availability Zones, and Local Zones - Amazon Relational Database Service
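
A minimal boto3 sketch is shown below: it lists the Regions where Amazon RDS is offered and then pins an RDS client to a chosen Region; eu-central-1 is used purely as an example of a Region close to the company.

# Hypothetical sketch: list the Regions where Amazon RDS is offered and pin
# an RDS client to the Region closest to the company (example: eu-central-1).
import boto3

session = boto3.session.Session()
rds_regions = session.get_available_regions("rds")
print("Amazon RDS is available in:", rds_regions)

# All subsequent RDS calls go to the chosen regional endpoint.
rds = session.client("rds", region_name="eu-central-1")
print(rds.meta.endpoint_url)  # e.g. https://rds.eu-central-1.amazonaws.com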

A developer wants to deploy an application quickly on AWS without manually creating the required resources. Which AWS service will meet these requirements?

A. Amazon EC2
B. AWS Elastic Beanstalk
C. AWS CodeBuild
D. Amazon Personalize
Suggested answer: B

Explanation:

AWS Elastic Beanstalk is a service that allows you to deploy and manage applications on AWS without manually creating and configuring the required resources, such as EC2 instances, load balancers, security groups, databases, and more. AWS Elastic Beanstalk automatically handles the provisioning, scaling, load balancing, health monitoring, and updating of your application, while giving you full control over the underlying AWS resources if needed. AWS Elastic Beanstalk supports a variety of platforms and languages, such as Java, .NET, PHP, Node.js, Python, Ruby, Go, and Docker. You can use the AWS Management Console, the AWS CLI, the AWS SDKs, or the AWS Elastic Beanstalk API to create and manage your applications. You can also use AWS CodeStar, AWS CodeCommit, AWS CodeBuild, AWS CodeDeploy, and AWS CodePipeline to integrate AWS Elastic Beanstalk with your development and deployment workflows.
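
As a minimal sketch (the application name, environment name, and platform choice are placeholders), the boto3 snippet below creates an Elastic Beanstalk application and launches an environment on a managed Python platform; Elastic Beanstalk then provisions the underlying resources itself.

# Hypothetical sketch: create an Elastic Beanstalk application and launch an
# environment from a managed platform. Names and the platform choice are
# placeholders; available solution stacks are listed first.
import boto3

eb = boto3.client("elasticbeanstalk")

eb.create_application(ApplicationName="demo-app")

# Pick a current Python platform from the list of available solution stacks.
stacks = eb.list_available_solution_stacks()["SolutionStacks"]
python_stack = next(s for s in stacks if "Python" in s)

# Without an application version, Elastic Beanstalk launches its sample app.
eb.create_environment(
    ApplicationName="demo-app",
    EnvironmentName="demo-app-env",
    SolutionStackName=python_stack,
)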
