ExamGecko

Amazon SAA-C03 Practice Test - Questions Answers, Page 54

A company runs its applications on Amazon EC2 instances that are backed by Amazon Elastic Block Store (Amazon EBS). The EC2 instances run the most recent Amazon Linux release. The applications are experiencing availability issues when the company's employees store and retrieve files that are 25 GB or larger. The company needs a solution that does not require the company to transfer files between EC2 instances. The files must be available across many EC2 instances and across multiple Availability Zones.

Which solution will meet these requirements?

A.
Migrate all the files to an Amazon S3 bucket. Instruct the employees to access the files from the S3 bucket.
B.
Take a snapshot of the existing EBS volume. Mount the snapshot as an EBS volume across the EC2 instances. Instruct the employees to access the files from the EC2 instances.
C.
Mount an Amazon Elastic File System (Amazon EFS) file system across all the EC2 instances. Instruct the employees to access the files from the EC2 instances.
D.
Create an Amazon Machine Image (AMI) from the EC2 instances. Configure new EC2 instances from the AMI that use an instance store volume. Instruct the employees to access the files from the EC2 instances.
Suggested answer: C

Explanation:

To store and access files that are 25 GB or larger across many EC2 instances and across multiple Availability Zones, Amazon Elastic File System (Amazon EFS) is a suitable solution. Amazon EFS provides a simple, scalable, elastic file system that can be mounted on multiple EC2 instances concurrently. Amazon EFS supports high availability and durability by storing data across multiple Availability Zones within a Region.

Reference:

What Is Amazon Elastic File System?

Using EFS with EC2
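As a sketch of how such a mount might be scripted on each instance, the NFSv4.1 options below are the ones recommended in the EFS documentation for Linux clients; the file system ID, Region, and mount point are placeholder values:

```python
# Sketch: build the NFS mount command for an EFS file system.
# The file system ID, Region, and mount point are placeholders.
EFS_DNS = "fs-0123456789abcdef0.efs.us-east-1.amazonaws.com"
MOUNT_POINT = "/mnt/efs"

# NFS options recommended in the EFS documentation for Linux clients.
NFS_OPTIONS = "nfsvers=4.1,rsize=1048576,wsize=1048576,hard,timeo=600,retrans=2,noresvport"

def build_mount_command(dns_name: str, mount_point: str) -> str:
    """Return the mount command a bootstrap script would run on each instance."""
    return f"sudo mount -t nfs4 -o {NFS_OPTIONS} {dns_name}:/ {mount_point}"

print(build_mount_command(EFS_DNS, MOUNT_POINT))
```

Because every instance in every Availability Zone mounts the same file system, the 25 GB files are written once and immediately visible everywhere, with no copying between instances.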

A solutions architect is using an AWS CloudFormation template to deploy a three-tier web application. The web application consists of a web tier and an application tier that stores and retrieves user data in Amazon DynamoDB tables. The web and application tiers are hosted on Amazon EC2 instances, and the database tier is not publicly accessible. The application EC2 instances need to access the DynamoDB tables without exposing API credentials in the template.

What should the solutions architect do to meet these requirements?

A.
Create an IAM role to read the DynamoDB tables. Associate the role with the application instances by referencing an instance profile.
B.
Create an IAM role that has the required permissions to read and write from the DynamoDB tables. Add the role to the EC2 instance profile, and associate the instance profile with the application instances.
C.
Use the parameter section in the AWS CloudFormation template to have the user input access and secret keys from an already-created IAM user that has the required permissions to read and write from the DynamoDB tables.
D.
Create an IAM user in the AWS CloudFormation template that has the required permissions to read and write from the DynamoDB tables. Use the GetAtt function to retrieve the access and secret keys, and pass them to the application instances through the user data.
Suggested answer: B

Explanation:

This solution allows the application EC2 instances to access the DynamoDB tables without exposing API credentials in the template. By creating an IAM role that has the required permissions to read and write from the DynamoDB tables and adding it to the EC2 instance profile, the application instances can use temporary security credentials that AWS rotates automatically. This is the secure, best-practice way to grant EC2 instances access to AWS resources.

Reference:

IAM Roles for Amazon EC2 Using Instance Profiles
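A minimal sketch of the CloudFormation resources this answer describes, written here as a Python dict for illustration (the role, policy, and table names are placeholders, and the policy actions are a representative read/write subset):

```python
# Sketch: CloudFormation resources for an IAM role with DynamoDB read/write
# access, attached to EC2 instances through an instance profile.
# "AppRole", "UserTable", and the policy name are placeholder identifiers.
template = {
    "Resources": {
        "AppRole": {
            "Type": "AWS::IAM::Role",
            "Properties": {
                # Trust policy: only the EC2 service may assume this role.
                "AssumeRolePolicyDocument": {
                    "Version": "2012-10-17",
                    "Statement": [{
                        "Effect": "Allow",
                        "Principal": {"Service": "ec2.amazonaws.com"},
                        "Action": "sts:AssumeRole",
                    }],
                },
                "Policies": [{
                    "PolicyName": "DynamoDBReadWrite",
                    "PolicyDocument": {
                        "Version": "2012-10-17",
                        "Statement": [{
                            "Effect": "Allow",
                            "Action": [
                                "dynamodb:GetItem", "dynamodb:Query",
                                "dynamodb:PutItem", "dynamodb:UpdateItem",
                            ],
                            "Resource": {"Fn::GetAtt": ["UserTable", "Arn"]},
                        }],
                    },
                }],
            },
        },
        "AppInstanceProfile": {
            "Type": "AWS::IAM::InstanceProfile",
            "Properties": {"Roles": [{"Ref": "AppRole"}]},
        },
    },
}
```

Note that no access key or secret key appears anywhere in the template; the instances receive short-lived credentials from the instance metadata service at runtime.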

A company sends AWS CloudTrail logs from multiple AWS accounts to an Amazon S3 bucket in a centralized account. The company must keep the CloudTrail logs. The company must also be able to query the CloudTrail logs at any time.

Which solution will meet these requirements?

A.
Use the CloudTrail event history in the centralized account to create an Amazon Athena table. Query the CloudTrail logs from Athena.
B.
Configure an Amazon Neptune instance to manage the CloudTrail logs. Query the CloudTrail logs from Neptune.
C.
Configure CloudTrail to send the logs to an Amazon DynamoDB table. Create a dashboard in Amazon QuickSight to query the logs in the table.
D.
Use Amazon Athena to create an Athena notebook. Configure CloudTrail to send the logs to the notebook. Run queries from Athena.
Suggested answer: A

Explanation:

This solution allows the company to keep the CloudTrail logs and query them at any time. The CloudTrail event history in the centralized account lets the company view, filter, and download recent API activity across multiple AWS accounts, and its console can create an Amazon Athena table over the logs in S3. Athena is a serverless interactive query service that analyzes data in Amazon S3 using standard SQL, so querying the CloudTrail logs from Athena gives the company insight into user activity and resource changes without managing any infrastructure.

Reference:

Viewing Events with CloudTrail Event History

Querying AWS CloudTrail Logs

Amazon Athena
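Once the Athena table exists, queries are plain SQL. A hedged sketch, with the table name, database, and results bucket as placeholders (the CloudTrail console generates the actual table definition):

```python
# Sketch: a sample Athena query over a CloudTrail table, plus boto3-style
# parameters for StartQueryExecution. "cloudtrail_logs", the database, and
# the output bucket are placeholders.
QUERY = (
    "SELECT useridentity.arn, eventname, sourceipaddress, eventtime "
    "FROM cloudtrail_logs "
    "WHERE eventtime >= '2024-01-01T00:00:00Z' "
    "ORDER BY eventtime DESC LIMIT 100"
)

start_query_params = {
    "QueryString": QUERY,
    "QueryExecutionContext": {"Database": "default"},  # placeholder database
    "ResultConfiguration": {"OutputLocation": "s3://athena-results-bucket/"},  # placeholder
}
# With AWS credentials, this would be submitted as:
# boto3.client("athena").start_query_execution(**start_query_params)
```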

A company has five organizational units (OUs) as part of its organization in AWS Organizations. Each OU corresponds to one of the five businesses that the company owns. The company's research and development (R&D) business is separating from the company and will need its own organization. A solutions architect creates a separate new management account for this purpose.

What should the solutions architect do next in the new management account?

A.
Have the R&D AWS account be part of both organizations during the transition.
B.
Invite the R&D AWS account to be part of the new organization after the R&D AWS account has left the prior organization.
C.
Create a new R&D AWS account in the new organization. Migrate resources from the prior R&D AWS account to the new R&D AWS account.
D.
Have the R&D AWS account join the new organization. Make the new management account a member of the prior organization.
Suggested answer: B

Explanation:

This solution allows the solutions architect to create a separate organization for the research and development (R&D) business and move its AWS account to the new organization. Because an AWS account can belong to only one organization at a time, inviting the R&D AWS account to join the new organization only after it has left the prior organization avoids any overlap or conflict between the two. The R&D AWS account can accept or decline the invitation; once it accepts, it becomes subject to any policies and controls applied by the new organization.

Reference:

Inviting an AWS Account to Join Your Organization

Leaving an Organization as a Member Account
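A sketch of the Organizations API call the new management account would make, shown as the request parameters only (the 12-digit account ID is a placeholder):

```python
# Sketch: parameters for Organizations' InviteAccountToOrganization API,
# issued from the new management account after the R&D account has left
# the prior organization. The account ID below is a placeholder.
invite_params = {
    "Target": {
        "Id": "111122223333",  # R&D member account ID (placeholder)
        "Type": "ACCOUNT",
    },
    "Notes": "Invitation to join the new R&D organization",
}
# With credentials for the new management account, this would be sent as:
# boto3.client("organizations").invite_account_to_organization(**invite_params)
```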

A company is deploying an application that processes large quantities of data in parallel. The company plans to use Amazon EC2 instances for the workload. The network architecture must be configurable to prevent groups of nodes from sharing the same underlying hardware.

Which networking solution meets these requirements?

A.
Run the EC2 instances in a spread placement group.
B.
Group the EC2 instances in separate accounts.
C.
Configure the EC2 instances with dedicated tenancy.
D.
Configure the EC2 instances with shared tenancy.
Suggested answer: A

Explanation:

This solution allows the company to process large quantities of data in parallel while preventing groups of nodes from sharing the same underlying hardware. A spread placement group launches a small number of instances across distinct underlying hardware to reduce correlated failures: each instance is placed on its own rack, with its own network and power source.

Reference:

Placement Groups

Spread Placement Groups
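A sketch of the EC2 API parameters involved, as plain dicts (the group name, AMI ID, and instance type are placeholders):

```python
# Sketch: EC2 API parameters for creating a spread placement group and
# launching instances into it. Names and the AMI ID are placeholders.
create_group_params = {
    "GroupName": "parallel-workers",
    "Strategy": "spread",  # each instance lands on distinct underlying hardware
}
run_instances_params = {
    "ImageId": "ami-0123456789abcdef0",  # placeholder AMI
    "InstanceType": "c5.xlarge",
    "MinCount": 4,
    "MaxCount": 4,
    "Placement": {"GroupName": "parallel-workers"},
}
```

One design constraint to note: a spread placement group supports a maximum of seven running instances per Availability Zone, so larger fleets are spread across multiple Availability Zones or groups.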

A solutions architect is creating a new Amazon CloudFront distribution for an application. Some of the information submitted by users is sensitive. The application uses HTTPS but needs another layer of security. The sensitive information should be protected throughout the entire application stack, and access to the information should be restricted to certain applications.

Which action should the solutions architect take?

A.
Configure a CloudFront signed URL.
B.
Configure a CloudFront signed cookie.
C.
Configure a CloudFront field-level encryption profile.
D.
Configure CloudFront and set the Origin Protocol Policy setting to HTTPS Only for the Viewer Protocol Policy.
Suggested answer: C

Explanation:

This solution protects sensitive information submitted by users throughout the entire application stack and restricts access to certain applications. With a CloudFront field-level encryption profile, specific fields of user data are encrypted at the edge locations before the request is forwarded to the origin. Because the encryption uses public-private key pairs, only the applications that hold the private key can decrypt and access the sensitive fields.

Reference:

Field-Level Encryption

Encrypting and Decrypting Data
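A hedged sketch of what such a profile configuration looks like, written as a boto3-style dict; the public key ID, provider ID, and field pattern are all placeholder values, and the exact field names should be checked against the CloudFront API reference:

```python
# Sketch: a CloudFront field-level encryption profile configuration.
# The public key ID, provider ID, and field pattern are placeholders.
profile_config = {
    "Name": "sensitive-fields",
    "CallerReference": "fle-profile-001",  # idempotency token
    "EncryptionEntities": {
        "Quantity": 1,
        "Items": [{
            "PublicKeyId": "K2ABCDEFGHIJKL",   # placeholder CloudFront public key
            "ProviderId": "payment-provider",  # placeholder provider identifier
            # Only POST form fields matching these patterns are encrypted at the edge.
            "FieldPatterns": {"Quantity": 1, "Items": ["CreditCardNumber"]},
        }],
    },
}
```

The matching private key stays with the one backend application allowed to read the field, so even the origin web servers see only ciphertext.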

A company runs its applications on Amazon EC2 instances. The company performs periodic financial assessments of its AWS costs. The company recently identified unusual spending.

The company needs a solution to prevent unusual spending. The solution must monitor costs and notify responsible stakeholders in the event of unusual spending.

Which solution will meet these requirements?

A.
Use an AWS Budgets template to create a zero spend budget.
B.
Create an AWS Cost Anomaly Detection monitor in the AWS Billing and Cost Management console.
C.
Create AWS Pricing Calculator estimates for the current running workload's pricing details.
D.
Use Amazon CloudWatch to monitor costs and to identify unusual spending.
Suggested answer: B

Explanation:

This solution monitors costs and notifies responsible stakeholders in the event of unusual spending. An AWS Cost Anomaly Detection monitor, created in the AWS Billing and Cost Management console, uses machine learning to automatically detect and alert on anomalous spend. By configuring alert thresholds, notification preferences, and root-cause analysis, the company can catch unusual spending and identify its source.

Reference:

AWS Cost Anomaly Detection

Creating a Cost Anomaly Monitor
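The same setup can also be scripted through the Cost Explorer API; a sketch of the request parameters, with the monitor name, ARN, and subscriber address as placeholders:

```python
# Sketch: boto3-style parameters for Cost Explorer's CreateAnomalyMonitor
# and CreateAnomalySubscription. Names, the ARN, and the email address
# are placeholders.
monitor_params = {
    "AnomalyMonitor": {
        "MonitorName": "service-spend-monitor",
        "MonitorType": "DIMENSIONAL",
        "MonitorDimension": "SERVICE",  # evaluate spend per AWS service
    }
}
subscription_params = {
    "AnomalySubscription": {
        "SubscriptionName": "stakeholder-alerts",
        "MonitorArnList": ["arn:aws:ce::111122223333:anomalymonitor/placeholder"],
        "Subscribers": [{"Address": "finance-team@example.com", "Type": "EMAIL"}],
        "Frequency": "IMMEDIATE",  # alert as soon as an anomaly is detected
    }
}
# With credentials: boto3.client("ce").create_anomaly_monitor(**monitor_params)
```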

A recent analysis of a company's IT expenses highlights the need to reduce backup costs. The company's chief information officer wants to simplify the on-premises backup infrastructure and reduce costs by eliminating the use of physical backup tapes. The company must preserve the existing investment in the on-premises backup applications and workflows.

What should a solutions architect recommend?

A.
Set up AWS Storage Gateway to connect with the backup applications using the NFS interface.
B.
Set up an Amazon EFS file system that connects with the backup applications using the NFS interface.
C.
Set up an Amazon EFS file system that connects with the backup applications using the iSCSI interface.
D.
Set up AWS Storage Gateway to connect with the backup applications using the iSCSI-virtual tape library (VTL) interface.
Suggested answer: D

Explanation:

This solution simplifies the on-premises backup infrastructure and reduces costs by eliminating physical backup tapes. By setting up AWS Storage Gateway with the iSCSI virtual tape library (VTL) interface, the company's existing backup applications write to virtual tapes that are stored in Amazon S3 and can be archived to S3 Glacier. Because the VTL presents itself to the backup software as a standard tape library, the existing backup applications and workflows are preserved unchanged.

Reference:

AWS Storage Gateway

Tape Gateway
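As a sketch, provisioning virtual tapes on a tape gateway uses the Storage Gateway CreateTapes API; the gateway ARN, barcode prefix, and sizes below are placeholder values:

```python
# Sketch: Storage Gateway CreateTapes parameters (boto3-style dict) for a
# tape gateway. The gateway ARN and barcode prefix are placeholders.
TiB = 1024 ** 4  # one tebibyte in bytes

create_tapes_params = {
    "GatewayARN": "arn:aws:storagegateway:us-east-1:111122223333:gateway/sgw-12345678",
    "TapeSizeInBytes": 2 * TiB,       # capacity of each virtual tape
    "ClientToken": "tape-batch-001",  # idempotency token
    "NumTapesToCreate": 5,
    "TapeBarcodePrefix": "BKP",       # barcodes the backup software will see
}
# With credentials:
# boto3.client("storagegateway").create_tapes(**create_tapes_params)
```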

A company is moving its data and applications to AWS during a multiyear migration project. The company wants to securely access data on Amazon S3 from the company's AWS Region and from the company's on-premises location. The data must not traverse the internet. The company has established an AWS Direct Connect connection between its Region and its on-premises location.

Which solution will meet these requirements?

A.
Create gateway endpoints for Amazon S3. Use the gateway endpoints to securely access the data from the Region and the on-premises location.
B.
Create a gateway in AWS Transit Gateway to access Amazon S3 securely from the Region and the on-premises location.
C.
Create interface endpoints for Amazon S3. Use the interface endpoints to securely access the data from the Region and the on-premises location.
D.
Use an AWS Key Management Service (AWS KMS) key to access the data securely from the Region and the on-premises location.
Suggested answer: C

Explanation:

A gateway endpoint for Amazon S3 is a target for a specified route in a VPC route table. Gateway endpoints can be used only by resources inside the VPC; traffic from an on-premises network cannot reach them, even over AWS Direct Connect. Therefore, option A does not meet the on-premises requirement.

AWS Transit Gateway connects VPCs and on-premises networks to a single gateway, but it does not by itself provide private connectivity to Amazon S3; an S3 endpoint would still be required. Therefore, option B is incomplete.

AWS Key Management Service (AWS KMS) creates and manages encryption keys to protect data. It does not provide a private network path to Amazon S3. Therefore, option D is incorrect.

An interface endpoint for Amazon S3 is an elastic network interface with private IP addresses, powered by AWS PrivateLink, that serves as an entry point for traffic destined to S3. It can be reached both from within the Region and from the on-premises location over the existing AWS Direct Connect connection, so the data never traverses the internet. Therefore, option C is correct.
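The difference between the two endpoint types shows up directly in the EC2 CreateVpcEndpoint parameters; a sketch with placeholder VPC, subnet, and route table IDs:

```python
# Sketch: EC2 CreateVpcEndpoint parameters (boto3-style dicts) for the two
# S3 endpoint types. VPC, subnet, and route table IDs are placeholders.
gateway_endpoint = {
    "VpcEndpointType": "Gateway",
    "VpcId": "vpc-0123456789abcdef0",
    "ServiceName": "com.amazonaws.us-east-1.s3",
    # Attached to route tables, so it is reachable only from inside the VPC.
    "RouteTableIds": ["rtb-0123456789abcdef0"],
}
interface_endpoint = {
    "VpcEndpointType": "Interface",
    "VpcId": "vpc-0123456789abcdef0",
    "ServiceName": "com.amazonaws.us-east-1.s3",
    # An ENI with private IPs: also reachable from on premises over
    # Direct Connect, which is what this question requires.
    "SubnetIds": ["subnet-0123456789abcdef0"],
}
```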

A company's website handles millions of requests each day, and the number of requests continues to increase. A solutions architect needs to improve the response time of the web application. The solutions architect determines that the application needs to decrease latency when retrieving product details from the Amazon DynamoDB table.

Which solution will meet these requirements with the LEAST amount of operational overhead?

A.
Set up a DynamoDB Accelerator (DAX) cluster. Route all read requests through DAX.
B.
Set up Amazon ElastiCache for Redis between the DynamoDB table and the web application. Route all read requests through Redis.
C.
Set up Amazon ElastiCache for Memcached between the DynamoDB table and the web application. Route all read requests through Memcached.
D.
Set up Amazon DynamoDB Streams on the table, and have AWS Lambda read from the table and populate Amazon ElastiCache. Route all read requests through ElastiCache.
Suggested answer: A

Explanation:

This solution improves the response time of the web application and decreases latency when retrieving product details from the Amazon DynamoDB table. DynamoDB Accelerator (DAX) is a fully managed, highly available, in-memory cache for DynamoDB that can deliver up to a 10x read performance improvement. Because DAX is API-compatible with DynamoDB, routing all read requests through the DAX cluster requires minimal application changes and no separate cache-management code, which makes it the option with the least operational overhead.

Reference:

Amazon DynamoDB Accelerator (DAX)

Using DAX with DynamoDB
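The read-through caching behavior DAX provides can be illustrated conceptually; the sketch below uses an in-memory dict standing in for the cache and a plain dict standing in for the DynamoDB table (DAX does all of this transparently behind a DynamoDB-compatible API, so this is only an illustration of the pattern, not DAX's actual client code):

```python
# Conceptual sketch of the read-through pattern DAX implements.
# "table" stands in for the DynamoDB table; "cache" for the DAX cluster.
table = {"sku-1": {"name": "widget", "price": 25}}
cache = {}

def get_item(key):
    """Serve from the cache when possible; on a miss, read the table and cache it."""
    if key in cache:
        return cache[key]       # cache hit: no table read at all
    item = table.get(key)       # cache miss: exactly one table read
    if item is not None:
        cache[key] = item
    return item

first = get_item("sku-1")   # miss: populates the cache
second = get_item("sku-1")  # hit: served from memory
```

Repeated reads of hot product details are served from memory, which is why the table sees fewer read operations as traffic grows.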

Total 886 questions