Amazon SAA-C03 Practice Test - Questions Answers, Page 8
A company runs a shopping application that uses Amazon DynamoDB to store customer information.

In case of data corruption, a solutions architect needs to design a solution that meets a recovery point objective (RPO) of 15 minutes and a recovery time objective (RTO) of 1 hour. What should the solutions architect recommend to meet these requirements?

A. Configure DynamoDB global tables. For RPO recovery, point the application to a different AWS Region.
B. Configure DynamoDB point-in-time recovery. For RPO recovery, restore to the desired point in time.
C. Export the DynamoDB data to Amazon S3 Glacier on a daily basis. For RPO recovery, import the data from S3 Glacier to DynamoDB.
D. Schedule Amazon Elastic Block Store (Amazon EBS) snapshots for the DynamoDB table every 15 minutes. For RPO recovery, restore the DynamoDB table by using the EBS snapshot.
Suggested answer: B

Explanation:

https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/PointInTimeRecovery.html
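
As a rough illustration of the suggested answer, the boto3 sketch below enables point-in-time recovery (PITR) and then restores the table to a timestamp just before the corruption; the table names and timestamp are hypothetical.

```python
import boto3
from datetime import datetime, timezone

dynamodb = boto3.client("dynamodb")

# Enable continuous backups with point-in-time recovery on the table.
dynamodb.update_continuous_backups(
    TableName="CustomerInfo",  # hypothetical table name
    PointInTimeRecoverySpecification={"PointInTimeRecoveryEnabled": True},
)

# After corruption is detected, restore to a new table at any second within
# the last 35 days, which comfortably satisfies a 15-minute RPO.
dynamodb.restore_table_to_point_in_time(
    SourceTableName="CustomerInfo",
    TargetTableName="CustomerInfo-restored",
    RestoreDateTime=datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc),  # hypothetical
)
```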

A company runs a photo processing application that needs to frequently upload and download pictures from Amazon S3 buckets that are located in the same AWS Region. A solutions architect has noticed an increased cost in data transfer fees and needs to implement a solution to reduce these costs.

How can the solutions architect meet this requirement?

A. Deploy Amazon API Gateway into a public subnet and adjust the route table to route S3 calls through it.
B. Deploy a NAT gateway into a public subnet and attach an endpoint policy that allows access to the S3 buckets.
C. Deploy the application into a public subnet and allow it to route through an internet gateway to access the S3 buckets.
D. Deploy an S3 VPC gateway endpoint into the VPC and attach an endpoint policy that allows access to the S3 buckets.
Suggested answer: D

Explanation:

A gateway VPC endpoint for Amazon S3 keeps traffic between the VPC and S3 on the AWS network, and gateway endpoints carry no hourly or data processing charges, so routing the application's S3 calls through one eliminates the NAT gateway and internet data transfer fees.
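A minimal sketch of option D with boto3, assuming an existing VPC and route table; the IDs, Region, bucket name, and policy are placeholders.

```python
import json
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed Region

# Create a gateway endpoint for S3 and associate it with the subnet's route
# table; S3-bound traffic then bypasses the NAT/internet path entirely.
ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",            # hypothetical VPC ID
    ServiceName="com.amazonaws.us-east-1.s3",
    RouteTableIds=["rtb-0123456789abcdef0"],  # hypothetical route table
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": "*",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::photo-bucket/*",  # hypothetical bucket
        }],
    }),
)
```
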
A company recently launched Linux-based application instances on Amazon EC2 in a private subnet and launched a Linux-based bastion host on an Amazon EC2 instance in a public subnet of a VPC. A solutions architect needs to connect from the on-premises network, through the company's internet connection, to the bastion host and to the application servers. The solutions architect must make sure that the security groups of all the EC2 instances will allow that access. Which combination of steps should the solutions architect take to meet these requirements? (Select TWO.)

A. Replace the current security group of the bastion host with one that only allows inbound access from the application instances.
B. Replace the current security group of the bastion host with one that only allows inbound access from the internal IP range for the company.
C. Replace the current security group of the bastion host with one that only allows inbound access from the external IP range for the company.
D. Replace the current security group of the application instances with one that allows inbound SSH access from only the private IP address of the bastion host.
E. Replace the current security group of the application instances with one that allows inbound SSH access from only the public IP address of the bastion host.
Suggested answer: C, D

Explanation:

https://digitalcloud.training/ssh-into-ec2-in-private-subnet/
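
One way to express answers C and D in boto3; the security group IDs and CIDR ranges below are made up for illustration.

```python
import boto3

ec2 = boto3.client("ec2")

# C: the bastion host accepts SSH only from the company's external (public) IP range.
ec2.authorize_security_group_ingress(
    GroupId="sg-0bastion000000000",  # hypothetical bastion security group
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
        "IpRanges": [{"CidrIp": "203.0.113.0/24",  # example corporate egress range
                      "Description": "Company external IPs"}],
    }],
)

# D: the application instances accept SSH only from the bastion's private IP.
ec2.authorize_security_group_ingress(
    GroupId="sg-0app0000000000000",  # hypothetical application security group
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
        "IpRanges": [{"CidrIp": "10.0.1.10/32",  # example bastion private IP
                      "Description": "Bastion host only"}],
    }],
)
```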

A solutions architect is designing a two-tier web application. The application consists of a public-facing web tier hosted on Amazon EC2 in public subnets. The database tier consists of Microsoft SQL Server running on Amazon EC2 in a private subnet. Security is a high priority for the company. How should security groups be configured in this situation? (Select TWO.)

A. Configure the security group for the web tier to allow inbound traffic on port 443 from 0.0.0.0/0.
B. Configure the security group for the web tier to allow outbound traffic on port 443 from 0.0.0.0/0.
C. Configure the security group for the database tier to allow inbound traffic on port 1433 from the security group for the web tier.
D. Configure the security group for the database tier to allow outbound traffic on ports 443 and 1433 to the security group for the web tier.
E. Configure the security group for the database tier to allow inbound traffic on ports 443 and 1433 from the security group for the web tier.
Suggested answer: A, C

Explanation:

https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/security-group-rules-reference.html
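
A sketch of answers A and C; the group IDs are placeholders, and note that the database rule references the web tier's security group rather than an IP range, which is what keeps the database tier locked to web-tier traffic only.

```python
import boto3

ec2 = boto3.client("ec2")

WEB_SG = "sg-0web0000000000000"  # hypothetical web tier security group
DB_SG = "sg-0db00000000000000"   # hypothetical database tier security group

# A: web tier allows inbound HTTPS from anywhere.
ec2.authorize_security_group_ingress(
    GroupId=WEB_SG,
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
        "IpRanges": [{"CidrIp": "0.0.0.0/0"}],
    }],
)

# C: database tier allows SQL Server traffic (port 1433) only from
# instances that belong to the web tier's security group.
ec2.authorize_security_group_ingress(
    GroupId=DB_SG,
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 1433, "ToPort": 1433,
        "UserIdGroupPairs": [{"GroupId": WEB_SG}],
    }],
)
```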

A company wants to move a multi-tiered application from on premises to the AWS Cloud to improve the application's performance. The application consists of application tiers that communicate with each other by way of RESTful services. Transactions are dropped when one tier becomes overloaded.

A solutions architect must design a solution that resolves these issues and modernizes the application. Which solution meets these requirements and is the MOST operationally efficient?

A. Use Amazon API Gateway and direct transactions to the AWS Lambda functions as the application layer. Use Amazon Simple Queue Service (Amazon SQS) as the communication layer between application services.
B. Use Amazon CloudWatch metrics to analyze the application performance history to determine the server's peak utilization during the performance failures. Increase the size of the application server's Amazon EC2 instances to meet the peak requirements.
C. Use Amazon Simple Notification Service (Amazon SNS) to handle the messaging between application servers running on Amazon EC2 in an Auto Scaling group. Use Amazon CloudWatch to monitor the SNS queue length and scale up and down as required.
D. Use Amazon Simple Queue Service (Amazon SQS) to handle the messaging between application servers running on Amazon EC2 in an Auto Scaling group. Use Amazon CloudWatch to monitor the SQS queue length and scale up when communication failures are detected.
Suggested answer: A

Explanation:

https://aws.amazon.com/getting-started/hands-on/build-serverless-web-app-lambda-apigateways3-dynamodb-cognito/module-4/ The "Build a Serverless Web Application with AWS Lambda, Amazon API Gateway, AWS Amplify, Amazon DynamoDB, and Amazon Cognito" tutorial shows a setup similar to this question's scenario.
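
To make option A concrete, here is a hypothetical Lambda function behind API Gateway that decouples the tiers by queueing each transaction in SQS instead of calling the next service directly; the queue URL environment variable is a placeholder set by the deployment.

```python
import json
import os

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = os.environ["QUEUE_URL"]  # hypothetical, injected by the deployment

def handler(event, context):
    # An API Gateway proxy integration delivers the request body as a string.
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=event["body"])
    # Respond immediately; the downstream service consumes from the queue at
    # its own pace, so load spikes no longer cause dropped transactions.
    return {"statusCode": 202, "body": json.dumps({"status": "queued"})}
```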

A company receives 10 TB of instrumentation data each day from several machines located at a single factory. The data consists of JSON files stored on a storage area network (SAN) in an on-premises data center located within the factory. The company wants to send this data to Amazon S3, where it can be accessed by several additional systems that provide critical near-real-time analytics. A secure transfer is important because the data is considered sensitive.

Which solution offers the MOST reliable data transfer?

A. AWS DataSync over public internet
B. AWS DataSync over AWS Direct Connect
C. AWS Database Migration Service (AWS DMS) over public internet
D. AWS Database Migration Service (AWS DMS) over AWS Direct Connect
Suggested answer: B

Explanation:

These are some of the main use cases for AWS DataSync: Data migration – move active datasets rapidly over the network into Amazon S3, Amazon EFS, or FSx for Windows File Server. "DataSync includes encryption and integrity validation to help make sure your data arrives securely, intact, and ready to use." https://aws.amazon.com/datasync/faqs/
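
A rough boto3 sketch of kicking off the recurring DataSync transfer in option B, assuming a DataSync agent has already been activated for the on-premises SAN share and that both location ARNs exist; all ARNs and names are hypothetical.

```python
import boto3

datasync = boto3.client("datasync")

# Both locations are assumed to be configured already: an NFS/SMB location
# served by the on-premises DataSync agent, and the destination S3 bucket.
task = datasync.create_task(
    SourceLocationArn="arn:aws:datasync:us-east-1:111122223333:location/loc-src",       # hypothetical
    DestinationLocationArn="arn:aws:datasync:us-east-1:111122223333:location/loc-dst",  # hypothetical
    Name="factory-json-to-s3",
)

# Each execution performs data integrity validation, and the traffic rides
# the private Direct Connect link rather than the public internet.
datasync.start_task_execution(TaskArn=task["TaskArn"])
```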

A company needs to configure a real-time data ingestion architecture for its application. The company needs an API, a process that transforms data as the data is streamed, and a storage solution for the data. Which solution will meet these requirements with the LEAST operational overhead?

A. Deploy an Amazon EC2 instance to host an API that sends data to an Amazon Kinesis data stream. Create an Amazon Kinesis Data Firehose delivery stream that uses the Kinesis data stream as a data source. Use AWS Lambda functions to transform the data. Use the Kinesis Data Firehose delivery stream to send the data to Amazon S3.
B. Deploy an Amazon EC2 instance to host an API that sends data to AWS Glue. Stop source/destination checking on the EC2 instance. Use AWS Glue to transform the data and to send the data to Amazon S3.
C. Configure an Amazon API Gateway API to send data to an Amazon Kinesis data stream. Create an Amazon Kinesis Data Firehose delivery stream that uses the Kinesis data stream as a data source. Use AWS Lambda functions to transform the data. Use the Kinesis Data Firehose delivery stream to send the data to Amazon S3.
D. Configure an Amazon API Gateway API to send data to AWS Glue. Use AWS Lambda functions to transform the data. Use AWS Glue to send the data to Amazon S3.
Suggested answer: C
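
The transformation step in option C follows the standard Kinesis Data Firehose Lambda contract: each record arrives base64-encoded and must be returned with the same recordId, a result status, and the re-encoded data. The enrichment below is invented purely for illustration.

```python
import base64
import json

def handler(event, context):
    output = []
    for record in event["records"]:
        # Firehose delivers each record's payload base64-encoded.
        payload = json.loads(base64.b64decode(record["data"]))
        payload["ingested"] = True  # hypothetical transformation
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",  # "Dropped" and "ProcessingFailed" are also valid
            "data": base64.b64encode(json.dumps(payload).encode()).decode(),
        })
    return {"records": output}
```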

A company needs to keep user transaction data in an Amazon DynamoDB table.

The company must retain the data for 7 years.

What is the MOST operationally efficient solution that meets these requirements?

A. Use DynamoDB point-in-time recovery to back up the table continuously.
B. Use AWS Backup to create backup schedules and retention policies for the table.
C. Create an on-demand backup of the table by using the DynamoDB console. Store the backup in an Amazon S3 bucket. Set an S3 Lifecycle configuration for the S3 bucket.
D. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to invoke an AWS Lambda function. Configure the Lambda function to back up the table and to store the backup in an Amazon S3 bucket. Set an S3 Lifecycle configuration for the S3 bucket.
Answers
Suggested answer: B
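
A sketch of option B with boto3; the vault name, schedule, role, and table ARN are assumptions, and 2,557 days approximates the 7-year retention requirement.

```python
import boto3

backup = boto3.client("backup")

# Daily backups of the table, retained for roughly 7 years.
plan = backup.create_backup_plan(BackupPlan={
    "BackupPlanName": "dynamodb-7yr-retention",     # hypothetical
    "Rules": [{
        "RuleName": "daily",
        "TargetBackupVaultName": "Default",
        "ScheduleExpression": "cron(0 5 * * ? *)",  # daily at 05:00 UTC
        "Lifecycle": {"DeleteAfterDays": 2557},     # ~7 years
    }],
})

# Assign the DynamoDB table to the plan via a role AWS Backup can assume.
backup.create_backup_selection(
    BackupPlanId=plan["BackupPlanId"],
    BackupSelection={
        "SelectionName": "transactions-table",
        "IamRoleArn": "arn:aws:iam::111122223333:role/BackupRole",                    # hypothetical
        "Resources": ["arn:aws:dynamodb:us-east-1:111122223333:table/Transactions"],  # hypothetical
    },
)
```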

A company is planning to use an Amazon DynamoDB table for data storage. The company is concerned about cost optimization. The table will not be used on most mornings. In the evenings, the read and write traffic will often be unpredictable. When traffic spikes occur, they will happen very quickly.

What should a solutions architect recommend?

A. Create a DynamoDB table in on-demand capacity mode.
B. Create a DynamoDB table with a global secondary index.
C. Create a DynamoDB table with provisioned capacity and auto scaling.
D. Create a DynamoDB table in provisioned capacity mode, and configure it as a global table.
Answers
Suggested answer: A
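
Option A in boto3 terms: on-demand (PAY_PER_REQUEST) mode needs no capacity planning and absorbs sudden spikes, and the company pays nothing for throughput on idle mornings. The table definition below is hypothetical.

```python
import boto3

dynamodb = boto3.client("dynamodb")

# PAY_PER_REQUEST is on-demand capacity mode: pay per read/write request,
# with no provisioned throughput to size ahead of unpredictable spikes.
dynamodb.create_table(
    TableName="AppData",  # hypothetical table name
    AttributeDefinitions=[{"AttributeName": "pk", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "pk", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
)
```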

A company recently signed a contract with an AWS Managed Service Provider (MSP) Partner for help with an application migration initiative. A solutions architect needs to share an Amazon Machine Image (AMI) from an existing AWS account with the MSP Partner's AWS account. The AMI is backed by Amazon Elastic Block Store (Amazon EBS) and uses a customer managed customer master key (CMK) to encrypt EBS volume snapshots. What is the MOST secure way for the solutions architect to share the AMI with the MSP Partner's AWS account?

A. Make the encrypted AMI and snapshots publicly available. Modify the CMK's key policy to allow the MSP Partner's AWS account to use the key.
B. Modify the launchPermission property of the AMI. Share the AMI with the MSP Partner's AWS account only. Modify the CMK's key policy to allow the MSP Partner's AWS account to use the key.
C. Modify the launchPermission property of the AMI. Share the AMI with the MSP Partner's AWS account only. Modify the CMK's key policy to trust a new CMK that is owned by the MSP Partner for encryption.
D. Export the AMI from the source account to an Amazon S3 bucket in the MSP Partner's AWS account. Encrypt the S3 bucket with a CMK that is owned by the MSP Partner. Copy and launch the AMI in the MSP Partner's AWS account.
Answers
Suggested answer: B

Explanation:

Share the existing KMS key with the MSP external account because it has already been used to encrypt the AMI snapshot. https://docs.aws.amazon.com/kms/latest/developerguide/key-policy-modifying-external-accounts.html
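
The two EC2 calls behind option B might look like the following; the account ID, image ID, and snapshot ID are placeholders, and the key policy statement in the comment mirrors the cross-account pattern described in the linked KMS documentation.

```python
import boto3

ec2 = boto3.client("ec2")

MSP_ACCOUNT = "444455556666"  # hypothetical partner account ID

# Share the AMI with the partner account only (launchPermission).
ec2.modify_image_attribute(
    ImageId="ami-0123456789abcdef0",  # hypothetical AMI ID
    LaunchPermission={"Add": [{"UserId": MSP_ACCOUNT}]},
)

# The backing EBS snapshot must also be shared for the AMI to be usable.
ec2.modify_snapshot_attribute(
    SnapshotId="snap-0123456789abcdef0",  # hypothetical snapshot ID
    Attribute="createVolumePermission",
    OperationType="add",
    UserIds=[MSP_ACCOUNT],
)

# The CMK's key policy additionally needs a statement like the following so
# the partner account can use the key for the shared encrypted snapshot:
# {"Effect": "Allow",
#  "Principal": {"AWS": "arn:aws:iam::444455556666:root"},
#  "Action": ["kms:Decrypt", "kms:DescribeKey", "kms:CreateGrant"],
#  "Resource": "*"}
```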
