
Amazon SAA-C03 Practice Test - Questions and Answers, Page 14


A company is developing a file-sharing application that will use an Amazon S3 bucket for storage. The company wants to serve all the files through an Amazon CloudFront distribution. The company does not want the files to be accessible through direct navigation to the S3 URL.

What should a solutions architect do to meet these requirements?

A. Write individual policies for each S3 bucket to grant read permission for only CloudFront access.
B. Create an IAM user. Grant the user read permission to objects in the S3 bucket. Assign the user to CloudFront.
C. Write an S3 bucket policy that assigns the CloudFront distribution ID as the Principal and assigns the target S3 bucket as the Amazon Resource Name (ARN).
D. Create an origin access identity (OAI). Assign the OAI to the CloudFront distribution. Configure the S3 bucket permissions so that only the OAI has read permission.
Suggested answer: D

Explanation:

https://aws.amazon.com/premiumsupport/knowledge-center/cloudfront-access-to-amazon-s3/
https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-restricting-access-to-s3.html#private-content-restricting-access-to-s3-overview
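As a minimal sketch of the bucket policy that answer D describes (the bucket name and OAI ID below are hypothetical), the only principal allowed to read objects is the CloudFront OAI:

```python
import json

import boto3

s3 = boto3.client("s3")

BUCKET = "file-sharing-bucket"  # hypothetical bucket name
OAI_ID = "E2EXAMPLE123456"      # hypothetical OAI ID from the distribution

# Grant s3:GetObject on all objects to the CloudFront OAI and nobody else.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowCloudFrontOAIReadOnly",
            "Effect": "Allow",
            "Principal": {
                "AWS": (
                    "arn:aws:iam::cloudfront:user/"
                    f"CloudFront Origin Access Identity {OAI_ID}"
                )
            },
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        }
    ],
}

s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
```

With S3 Block Public Access enabled and no other public grants, direct navigation to the S3 URL returns 403 while CloudFront continues to serve the files. Note that AWS now recommends origin access control (OAC) over OAI for new distributions; the answer reflects the OAI-era wording of the exam.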

A company’s website provides users with downloadable historical performance reports. The website needs a solution that will scale to meet the company’s website demands globally. The solution should be cost-effective, limit the provisioning of infrastructure resources, and provide the fastest possible response time.

Which combination should a solutions architect recommend to meet these requirements?

A. Amazon CloudFront and Amazon S3
B. AWS Lambda and Amazon DynamoDB
C. Application Load Balancer with Amazon EC2 Auto Scaling
D. Amazon Route 53 with internal Application Load Balancers
Suggested answer: A

Explanation:

CloudFront delivers the reports from edge locations worldwide for the fastest possible response time, and S3 provides cost-effective storage with no infrastructure to provision or manage.
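A rough sketch of this combination (the bucket domain name is hypothetical): a CloudFront distribution with the S3 bucket as its origin, using the managed CachingOptimized cache policy to keep configuration minimal:

```python
import time

import boto3

cloudfront = boto3.client("cloudfront")

BUCKET_DOMAIN = "performance-reports.s3.us-east-1.amazonaws.com"  # hypothetical

cloudfront.create_distribution(
    DistributionConfig={
        "CallerReference": str(time.time()),  # must be unique per request
        "Comment": "Global distribution for downloadable reports",
        "Enabled": True,
        "Origins": {
            "Quantity": 1,
            "Items": [
                {
                    "Id": "reports-s3-origin",
                    "DomainName": BUCKET_DOMAIN,
                    "S3OriginConfig": {"OriginAccessIdentity": ""},
                }
            ],
        },
        "DefaultCacheBehavior": {
            "TargetOriginId": "reports-s3-origin",
            "ViewerProtocolPolicy": "redirect-to-https",
            # ID of the managed "CachingOptimized" cache policy
            "CachePolicyId": "658327ea-f89d-4fab-a63d-7e88639e58f6",
        },
    }
)
```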

A company runs an Oracle database on premises. As part of the company’s migration to AWS, the company wants to upgrade the database to the most recent available version. The company also wants to set up disaster recovery (DR) for the database. The company needs to minimize the operational overhead for normal operations and DR setup. The company also needs to maintain access to the database's underlying operating system. Which solution will meet these requirements?

A. Migrate the Oracle database to an Amazon EC2 instance. Set up database replication to a different AWS Region.
B. Migrate the Oracle database to Amazon RDS for Oracle. Activate Cross-Region automated backups to replicate the snapshots to another AWS Region.
C. Migrate the Oracle database to Amazon RDS Custom for Oracle. Create a read replica for the database in another AWS Region.
D. Migrate the Oracle database to Amazon RDS for Oracle. Create a standby database in another Availability Zone.
Suggested answer: C

Explanation:

Amazon RDS Custom for Oracle automates database management tasks while still providing access to the underlying operating system, and a cross-Region read replica provides DR with minimal operational overhead.

https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/rds-custom.html
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/working-with-custom-oracle.html
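A minimal sketch of the DR step in answer C, assuming hypothetical instance identifiers and Regions; note that RDS Custom read replicas have additional prerequisites (networking and an IAM instance profile, for example) that are not shown here:

```python
import boto3

# Create the replica in the DR Region; the source is referenced by ARN.
rds = boto3.client("rds", region_name="us-west-2")

rds.create_db_instance_read_replica(
    DBInstanceIdentifier="oracle-dr-replica",
    SourceDBInstanceIdentifier=(
        "arn:aws:rds:us-east-1:111122223333:db:oracle-primary"
    ),
    SourceRegion="us-east-1",  # boto3 generates the cross-Region presigned URL
)
```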

A company wants to move its application to a serverless solution. The serverless solution needs to analyze existing and new data by using SQL. The company stores the data in an Amazon S3 bucket. The data requires encryption and must be replicated to a different AWS Region.

Which solution will meet these requirements with the LEAST operational overhead?

A. Create a new S3 bucket. Load the data into the new S3 bucket. Use S3 Cross-Region Replication (CRR) to replicate encrypted objects to an S3 bucket in another Region. Use server-side encryption with AWS KMS multi-Region keys (SSE-KMS). Use Amazon Athena to query the data.
B. Create a new S3 bucket. Load the data into the new S3 bucket. Use S3 Cross-Region Replication (CRR) to replicate encrypted objects to an S3 bucket in another Region. Use server-side encryption with AWS KMS multi-Region keys (SSE-KMS). Use Amazon RDS to query the data.
C. Load the data into the existing S3 bucket. Use S3 Cross-Region Replication (CRR) to replicate encrypted objects to an S3 bucket in another Region. Use server-side encryption with Amazon S3 managed encryption keys (SSE-S3). Use Amazon Athena to query the data.
D. Load the data into the existing S3 bucket. Use S3 Cross-Region Replication (CRR) to replicate encrypted objects to an S3 bucket in another Region. Use server-side encryption with Amazon S3 managed encryption keys (SSE-S3). Use Amazon RDS to query the data.
Suggested answer: A
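Answer A pairs CRR with SSE-KMS multi-Region keys for encrypted replication and Athena for serverless SQL over S3. A minimal sketch of the replication rule, assuming hypothetical bucket names, replication role, and a multi-Region KMS key; versioning must already be enabled on both buckets. The SourceSelectionCriteria block is what tells S3 to replicate SSE-KMS objects, and the destination re-encrypts with the replica Region's copy of the multi-Region key:

```python
import boto3

s3 = boto3.client("s3")

s3.put_bucket_replication(
    Bucket="analytics-source-bucket",  # hypothetical source bucket
    ReplicationConfiguration={
        "Role": "arn:aws:iam::111122223333:role/s3-replication-role",
        "Rules": [
            {
                "ID": "ReplicateEncryptedObjects",
                "Status": "Enabled",
                "Priority": 1,
                "Filter": {},  # apply to the whole bucket
                "DeleteMarkerReplication": {"Status": "Disabled"},
                # Replicate objects encrypted with SSE-KMS.
                "SourceSelectionCriteria": {
                    "SseKmsEncryptedObjects": {"Status": "Enabled"}
                },
                "Destination": {
                    "Bucket": "arn:aws:s3:::analytics-replica-bucket",
                    "EncryptionConfiguration": {
                        # Replica-Region copy of the multi-Region key
                        "ReplicaKmsKeyID": (
                            "arn:aws:kms:us-west-2:111122223333:key/mrk-example"
                        )
                    },
                },
            }
        ],
    },
)
```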

A company runs workloads on AWS. The company needs to connect to a service from an external provider. The service is hosted in the provider's VPC. According to the company’s security team, the connectivity must be private and must be restricted to the target service. The connection must be initiated only from the company’s VPC.

Which solution will meet these requirements?

A. Create a VPC peering connection between the company's VPC and the provider's VPC. Update the route table to connect to the target service.
B. Ask the provider to create a virtual private gateway in its VPC. Use AWS PrivateLink to connect to the target service.
C. Create a NAT gateway in a public subnet of the company's VPC. Update the route table to connect to the target service.
D. Ask the provider to create a VPC endpoint for the target service. Use AWS PrivateLink to connect to the target service.
Suggested answer: D
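In PrivateLink terms, the provider exposes the service as a VPC endpoint service behind a Network Load Balancer, and the company creates an interface VPC endpoint to it; traffic stays private and connections can be initiated only from the consumer side. A minimal consumer-side sketch with hypothetical IDs and service name:

```python
import boto3

ec2 = boto3.client("ec2")

# The provider shares this endpoint service name after exposing the service.
response = ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.vpce.us-east-1.vpce-svc-0example1234567890",
    SubnetIds=["subnet-0example1", "subnet-0example2"],
    SecurityGroupIds=["sg-0example"],
    PrivateDnsEnabled=False,
)
print(response["VpcEndpoint"]["VpcEndpointId"])
```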

A company is migrating its on-premises PostgreSQL database to Amazon Aurora PostgreSQL. The on-premises database must remain online and accessible during the migration. The Aurora database must remain synchronized with the on-premises database.

Which combination of actions must a solutions architect take to meet these requirements? (Choose two.)

A. Create an ongoing replication task.
B. Create a database backup of the on-premises database.
C. Create an AWS Database Migration Service (AWS DMS) replication server.
D. Convert the database schema by using the AWS Schema Conversion Tool (AWS SCT).
E. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to monitor the database synchronization.
Suggested answer: A, C
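A minimal sketch of the two chosen actions, with hypothetical ARNs: once a DMS replication instance (answer C) and its source/target endpoints exist, a task with MigrationType full-load-and-cdc performs the initial copy and then keeps Aurora synchronized through ongoing replication (answer A):

```python
import json

import boto3

dms = boto3.client("dms")

# Select every schema and table; real migrations usually narrow this down.
table_mappings = {
    "rules": [
        {
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-all",
            "object-locator": {"schema-name": "%", "table-name": "%"},
            "rule-action": "include",
        }
    ]
}

dms.create_replication_task(
    ReplicationTaskIdentifier="postgres-to-aurora-cdc",
    SourceEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:SRC",
    TargetEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:TGT",
    ReplicationInstanceArn="arn:aws:dms:us-east-1:111122223333:rep:INST",
    MigrationType="full-load-and-cdc",  # full load, then ongoing replication
    TableMappings=json.dumps(table_mappings),
)
```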

A company uses AWS Organizations to create dedicated AWS accounts for each business unit to manage each business unit's account independently upon request. The root email recipient missed a notification that was sent to the root user email address of one account. The company wants to ensure that all future notifications are not missed. Future notifications must be limited to account administrators. Which solution will meet these requirements?

A. Configure the company's email server to forward notification email messages that are sent to the AWS account root user email address to all users in the organization.
B. Configure all AWS account root user email addresses as distribution lists that go to a few administrators who can respond to alerts. Configure AWS account alternate contacts in the AWS Organizations console or programmatically.
C. Configure all AWS account root user email messages to be sent to one administrator who is responsible for monitoring alerts and forwarding those alerts to the appropriate groups.
D. Configure all existing AWS accounts and all newly created accounts to use the same root user email address. Configure AWS account alternate contacts in the AWS Organizations console or programmatically.
Suggested answer: B

Explanation:

Pointing each account's root user email address at a distribution list ensures that notifications reach several administrators rather than a single mailbox, and configuring alternate contacts (billing, operations, and security) directs future notifications to the administrators who need to act on them. Option D is ruled out because AWS account root user email addresses must be unique across accounts.
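A minimal sketch (with hypothetical contact details) of setting an alternate contact programmatically with the AWS Account Management API; run it from the management account, or a delegated administrator, once per member account:

```python
import boto3

account = boto3.client("account")

account.put_alternate_contact(
    AccountId="111122223333",          # hypothetical member account
    AlternateContactType="OPERATIONS",  # also: BILLING, SECURITY
    Name="Cloud Ops Team",
    Title="Operations",
    EmailAddress="aws-ops@example.com",
    PhoneNumber="+1-555-0100",
)
```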
A company runs its ecommerce application on AWS. Every new order is published as a message in a RabbitMQ queue that runs on an Amazon EC2 instance in a single Availability Zone. These messages are processed by a different application that runs on a separate EC2 instance. This application stores the details in a PostgreSQL database on another EC2 instance. All the EC2 instances are in the same Availability Zone. The company needs to redesign its architecture to provide the highest availability with the least operational overhead. What should a solutions architect do to meet these requirements?

A. Migrate the queue to a redundant pair (active/standby) of RabbitMQ instances on Amazon MQ. Create a Multi-AZ Auto Scaling group for EC2 instances that host the application. Create another Multi-AZ Auto Scaling group for EC2 instances that host the PostgreSQL database.
B. Migrate the queue to a redundant pair (active/standby) of RabbitMQ instances on Amazon MQ. Create a Multi-AZ Auto Scaling group for EC2 instances that host the application. Migrate the database to run on a Multi-AZ deployment of Amazon RDS for PostgreSQL.
C. Create a Multi-AZ Auto Scaling group for EC2 instances that host the RabbitMQ queue. Create another Multi-AZ Auto Scaling group for EC2 instances that host the application. Migrate the database to run on a Multi-AZ deployment of Amazon RDS for PostgreSQL.
D. Create a Multi-AZ Auto Scaling group for EC2 instances that host the RabbitMQ queue. Create another Multi-AZ Auto Scaling group for EC2 instances that host the application. Create a third Multi-AZ Auto Scaling group for EC2 instances that host the PostgreSQL database.
Suggested answer: B

Explanation:

Migrating the queue to Amazon MQ removes the overhead of operating RabbitMQ yourself, which rules out options C and D. Choosing between A and B then comes down to hosting PostgreSQL on a Multi-AZ Auto Scaling group of EC2 instances or on a Multi-AZ deployment of Amazon RDS for PostgreSQL. The RDS option has less operational overhead because the database software, patching, and failover are provided as a service; consider, for instance, how much simpler it is to add a node such as a read replica to an RDS database.

https://docs.aws.amazon.com/amazon-mq/latest/developer-guide/active-standby-broker-deployment.html
https://aws.amazon.com/rds/postgresql/
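A minimal sketch of the database half of answer B, with hypothetical names; MultiAZ=True provisions a synchronous standby in a second Availability Zone with automatic failover (the queue half is handled by Amazon MQ):

```python
import boto3

rds = boto3.client("rds")

rds.create_db_instance(
    DBInstanceIdentifier="orders-postgres",
    Engine="postgres",
    DBInstanceClass="db.m5.large",
    AllocatedStorage=100,
    MasterUsername="postgres",
    ManageMasterUserPassword=True,  # store the password in Secrets Manager
    MultiAZ=True,                   # synchronous standby in a second AZ
)
```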


A reporting team receives files each day in an Amazon S3 bucket. The reporting team manually reviews and copies the files from this initial S3 bucket to an analysis S3 bucket each day at the same time to use with Amazon QuickSight. Additional teams are starting to send more files in larger sizes to the initial S3 bucket.

The reporting team wants to move the files automatically to the analysis S3 bucket as the files enter the initial S3 bucket. The reporting team also wants to use AWS Lambda functions to run pattern-matching code on the copied data. In addition, the reporting team wants to send the data files to a pipeline in Amazon SageMaker Pipelines.

What should a solutions architect do to meet these requirements with the LEAST operational overhead?

A. Create a Lambda function to copy the files to the analysis S3 bucket. Create an S3 event notification for the analysis S3 bucket. Configure Lambda and SageMaker Pipelines as destinations of the event notification. Configure s3:ObjectCreated:Put as the event type.
B. Create a Lambda function to copy the files to the analysis S3 bucket. Configure the analysis S3 bucket to send event notifications to Amazon EventBridge (Amazon CloudWatch Events). Configure an ObjectCreated rule in EventBridge (CloudWatch Events). Configure Lambda and SageMaker Pipelines as targets for the rule.
C. Configure S3 replication between the S3 buckets. Create an S3 event notification for the analysis S3 bucket. Configure Lambda and SageMaker Pipelines as destinations of the event notification. Configure s3:ObjectCreated:Put as the event type.
D. Configure S3 replication between the S3 buckets. Configure the analysis S3 bucket to send event notifications to Amazon EventBridge (Amazon CloudWatch Events). Configure an ObjectCreated rule in EventBridge (CloudWatch Events). Configure Lambda and SageMaker Pipelines as targets for the rule.
Suggested answer: D

Explanation:

This solution moves the files automatically, runs the Lambda pattern-matching code on the copied data, and sends the data files to SageMaker Pipelines with the least operational overhead. S3 replication copies the files from the initial S3 bucket to the analysis S3 bucket as they arrive, with no code to maintain. The analysis S3 bucket sends event notifications to Amazon EventBridge (Amazon CloudWatch Events) when an object is created, and a single ObjectCreated rule triggers both targets: the Lambda function and the SageMaker pipeline.

Options A and B are incorrect because a Lambda function that copies the files is unnecessary when S3 replication can do so automatically, and the function adds operational overhead of its own. Options A and C are also incorrect because S3 event notifications can deliver only to Amazon SNS, Amazon SQS, and AWS Lambda, so SageMaker Pipelines cannot be configured as a destination; EventBridge is required to fan out to it.

Reference:
https://aws.amazon.com/blogs/machine-learning/automate-feature-engineering-pipelines-with-amazon-sagemaker/
https://docs.aws.amazon.com/sagemaker/latest/dg/automating-sagemaker-with-eventbridge.html
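A minimal sketch of the event wiring in answer D, with hypothetical names and ARNs; the replication configuration between the buckets is set up separately, and the Lambda function still needs a resource-based permission so EventBridge can invoke it:

```python
import json

import boto3

s3 = boto3.client("s3")
events = boto3.client("events")

# Turn on EventBridge notifications for the analysis bucket.
s3.put_bucket_notification_configuration(
    Bucket="analysis-bucket",
    NotificationConfiguration={"EventBridgeConfiguration": {}},
)

# Rule that matches objects created in the analysis bucket.
events.put_rule(
    Name="analysis-bucket-object-created",
    EventPattern=json.dumps(
        {
            "source": ["aws.s3"],
            "detail-type": ["Object Created"],
            "detail": {"bucket": {"name": ["analysis-bucket"]}},
        }
    ),
)

# Fan out to the Lambda function and the SageMaker pipeline.
# (Grant EventBridge invoke permission on the function via
# lambda.add_permission, not shown here.)
events.put_targets(
    Rule="analysis-bucket-object-created",
    Targets=[
        {
            "Id": "pattern-matcher",
            "Arn": "arn:aws:lambda:us-east-1:111122223333:function:pattern-matcher",
        },
        {
            "Id": "reporting-pipeline",
            "Arn": "arn:aws:sagemaker:us-east-1:111122223333:pipeline/reporting-pipeline",
            "RoleArn": "arn:aws:iam::111122223333:role/eventbridge-sagemaker-role",
            "SageMakerPipelineParameters": {"PipelineParameterList": []},
        },
    ],
)
```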


A solutions architect needs to help a company optimize the cost of running an application on AWS.

The application will use Amazon EC2 instances, AWS Fargate, and AWS Lambda for compute within the architecture. The EC2 instances will run the data ingestion layer of the application. EC2 usage will be sporadic and unpredictable. Workloads that run on EC2 instances can be interrupted at any time. The application front end will run on Fargate, and Lambda will serve the API layer. The front-end utilization and API layer utilization will be predictable over the course of the next year. Which combination of purchasing options will provide the MOST cost-effective solution for hosting this application? (Choose two.)

A. Use Spot Instances for the data ingestion layer.
B. Use On-Demand Instances for the data ingestion layer.
C. Purchase a 1-year Compute Savings Plan for the front end and API layer.
D. Purchase 1-year All Upfront Reserved Instances for the data ingestion layer.
E. Purchase a 1-year EC2 Instance Savings Plan for the front end and API layer.
Suggested answer: A, C
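A minimal sketch of the Spot half of the answer, with hypothetical AMI and subnet IDs; because the ingestion workloads can be interrupted at any time, they can run as one-time Spot requests:

```python
import boto3

ec2 = boto3.client("ec2")

ec2.run_instances(
    ImageId="ami-0example1234567890",  # hypothetical AMI
    InstanceType="m5.large",
    MinCount=1,
    MaxCount=4,
    SubnetId="subnet-0example1",       # hypothetical subnet
    InstanceMarketOptions={
        "MarketType": "spot",
        "SpotOptions": {
            "SpotInstanceType": "one-time",
            "InstanceInterruptionBehavior": "terminate",
        },
    },
)
```

The 1-year Compute Savings Plan (option C) is purchased through Cost Explorer rather than per resource, and unlike an EC2 Instance Savings Plan (option E) it also covers Fargate and Lambda usage, which is why it fits the predictable front end and API layer.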