
Amazon SAA-C03 Practice Test - Questions Answers, Page 22

A company has a Microsoft .NET application that runs on an on-premises Windows Server. The application stores data by using an Oracle Database Standard Edition server. The company is planning a migration to AWS and wants to minimize development changes while moving the application. The AWS application environment should be highly available.

Which combination of actions should the company take to meet these requirements? (Select TWO.)

A. Refactor the application as serverless with AWS Lambda functions running .NET Core.
B. Rehost the application in AWS Elastic Beanstalk with the .NET platform in a Multi-AZ deployment.
C. Replatform the application to run on Amazon EC2 with the Amazon Linux Amazon Machine Image (AMI).
D. Use AWS Database Migration Service (AWS DMS) to migrate from the Oracle database to Amazon DynamoDB in a Multi-AZ deployment.
E. Use AWS Database Migration Service (AWS DMS) to migrate from the Oracle database to Oracle on Amazon RDS in a Multi-AZ deployment.
Suggested answer: B, E

A rapidly growing ecommerce company is running its workloads in a single AWS Region. A solutions architect must create a disaster recovery (DR) strategy that includes a different AWS Region. The company wants its database to be up to date in the DR Region with the least possible latency. The remaining infrastructure in the DR Region needs to run at reduced capacity and must be able to scale up if necessary. Which solution will meet these requirements with the LOWEST recovery time objective (RTO)?

A. Use an Amazon Aurora global database with a pilot light deployment.
B. Use an Amazon Aurora global database with a warm standby deployment.
C. Use an Amazon RDS Multi-AZ DB instance with a pilot light deployment.
D. Use an Amazon RDS Multi-AZ DB instance with a warm standby deployment.
Suggested answer: B

Explanation:

An Aurora global database replicates data to a secondary Region with typical latency of under a second, satisfying the requirement for an up-to-date database with the least possible latency. A warm standby keeps a scaled-down but fully functional copy of the infrastructure running in the DR Region that can scale up when needed, which gives a lower RTO than a pilot light deployment.

https://docs.aws.amazon.com/whitepapers/latest/disaster-recovery-workloads-on-aws/disaster-recovery-options-in-the-cloud.html

A company's order system sends requests from clients to Amazon EC2 instances. The EC2 instances process the orders and then store the orders in a database on Amazon RDS. Users report that they must reprocess orders when the system fails. The company wants a resilient solution that can process orders automatically if a system outage occurs. What should a solutions architect do to meet these requirements?

A. Move the EC2 instances into an Auto Scaling group. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to target an Amazon Elastic Container Service (Amazon ECS) task.
B. Move the EC2 instances into an Auto Scaling group behind an Application Load Balancer (ALB). Update the order system to send messages to the ALB endpoint.
C. Move the EC2 instances into an Auto Scaling group. Configure the order system to send messages to an Amazon Simple Queue Service (Amazon SQS) queue. Configure the EC2 instances to consume messages from the queue.
D. Create an Amazon Simple Notification Service (Amazon SNS) topic. Create an AWS Lambda function, and subscribe the function to the SNS topic. Configure the order system to send messages to the SNS topic. Send a command to the EC2 instances to process the messages by using AWS Systems Manager Run Command.
Suggested answer: C
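An SQS queue decouples order intake from processing: messages persist in the queue during an outage and are consumed once instances recover, so nothing needs manual reprocessing. A minimal sketch with boto3, assuming a hypothetical queue named orders-queue and a stubbed-out processing step; this is an illustration of the pattern, not the question's actual system:

```python
import json
import boto3

sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName="orders-queue")["QueueUrl"]  # hypothetical queue

def process_order(body: str) -> None:
    order = json.loads(body)
    # ... write the order to the Amazon RDS database here ...
    print("processed order", order["order_id"])

# Producer side: the order system enqueues each order instead of calling
# the processing instances directly, so orders survive an instance outage.
sqs.send_message(QueueUrl=queue_url, MessageBody=json.dumps({"order_id": "1234", "qty": 2}))

# Consumer side: EC2 instances in the Auto Scaling group long-poll the queue,
# process each message, and delete it only after successful processing.
while True:
    resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=20)
    for msg in resp.get("Messages", []):
        process_order(msg["Body"])
        sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```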

A company runs an application on a large fleet of Amazon EC2 instances. The application reads and writes entries in an Amazon DynamoDB table. The size of the DynamoDB table continuously grows, but the application needs only data from the last 30 days. The company needs a solution that minimizes cost and development effort.

Which solution meets these requirements?

A. Use an AWS CloudFormation template to deploy the complete solution. Redeploy the CloudFormation stack every 30 days, and delete the original stack.
B. Use an EC2 instance that runs a monitoring application from AWS Marketplace. Configure the monitoring application to use Amazon DynamoDB Streams to store the timestamp when a new item is created in the table. Use a script that runs on the EC2 instance to delete items that have a timestamp that is older than 30 days.
C. Configure Amazon DynamoDB Streams to invoke an AWS Lambda function when a new item is created in the table. Configure the Lambda function to delete items in the table that are older than 30 days.
D. Extend the application to add an attribute that has a value of the current timestamp plus 30 days to each new item that is created in the table. Configure DynamoDB to use the attribute as the TTL attribute.
Suggested answer: D

Explanation:

Amazon DynamoDB Time to Live (TTL) allows you to define a per-item timestamp to determine when an item is no longer needed. Shortly after the date and time of the specified timestamp, DynamoDB deletes the item from your table without consuming any write throughput. TTL is provided at no extra cost as a means to reduce stored data volumes by retaining only the items that remain current for your workload's needs. TTL is useful if you store items that lose relevance after a specific time. The following are example TTL use cases:

Remove user or sensor data after one year of inactivity in an application.

Archive expired items to an Amazon S3 data lake via Amazon DynamoDB Streams and AWS Lambda.

Retain sensitive data for a certain amount of time according to contractual or regulatory obligations.

https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/TTL.html
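A minimal sketch of option D with boto3, assuming a hypothetical table named orders and a TTL attribute named expires_at; DynamoDB expects the TTL attribute to hold an expiry time as an epoch timestamp in seconds:

```python
import time
import boto3

dynamodb = boto3.client("dynamodb")

# One-time setup: tell DynamoDB which attribute holds the expiry timestamp.
dynamodb.update_time_to_live(
    TableName="orders",  # hypothetical table
    TimeToLiveSpecification={"Enabled": True, "AttributeName": "expires_at"},
)

# Application change: stamp each new item with "now + 30 days" (epoch seconds).
expires_at = int(time.time()) + 30 * 24 * 60 * 60
dynamodb.put_item(
    TableName="orders",
    Item={
        "order_id": {"S": "1234"},
        "expires_at": {"N": str(expires_at)},
    },
)
```

DynamoDB then deletes expired items in the background without consuming write capacity, which is what keeps both cost and development effort minimal.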

A company runs a containerized application on a Kubernetes cluster in an on-premises data center. The company is using a MongoDB database for data storage. The company wants to migrate some of these environments to AWS, but no code changes or deployment method changes are possible at this time. The company needs a solution that minimizes operational overhead. Which solution meets these requirements?

A. Use Amazon Elastic Container Service (Amazon ECS) with Amazon EC2 worker nodes for compute and MongoDB on EC2 for data storage.
B. Use Amazon Elastic Container Service (Amazon ECS) with AWS Fargate for compute and Amazon DynamoDB for data storage.
C. Use Amazon Elastic Kubernetes Service (Amazon EKS) with Amazon EC2 worker nodes for compute and Amazon DynamoDB for data storage.
D. Use Amazon Elastic Kubernetes Service (Amazon EKS) with AWS Fargate for compute and Amazon DocumentDB (with MongoDB compatibility) for data storage.
Suggested answer: D

Explanation:

Amazon DocumentDB (with MongoDB compatibility) is a fast, reliable, and fully managed database service. Amazon DocumentDB makes it easy to set up, operate, and scale MongoDB-compatible databases in the cloud. With Amazon DocumentDB, you can run the same application code and use the same drivers and tools that you use with MongoDB.

https://docs.aws.amazon.com/documentdb/latest/developerguide/what-is.html
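Because Amazon DocumentDB is MongoDB-compatible, the existing application code can point at the new cluster with only a connection-string change. A hedged sketch with pymongo, assuming a hypothetical cluster endpoint and credentials; DocumentDB requires TLS and retryWrites=false:

```python
from pymongo import MongoClient

# Hypothetical DocumentDB cluster endpoint and credentials.
client = MongoClient(
    "mongodb://appuser:secret@my-cluster.cluster-xxxxxxxx.us-east-1.docdb.amazonaws.com:27017/"
    "?tls=true&tlsCAFile=global-bundle.pem&retryWrites=false"
)

# Same driver calls the application already uses against MongoDB.
orders = client["shop"]["orders"]
orders.insert_one({"order_id": "1234", "qty": 2})
print(orders.find_one({"order_id": "1234"}))
```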

A company serves a dynamic website from a fleet of Amazon EC2 instances behind an Application Load Balancer (ALB). The website needs to support multiple languages to serve customers around the world. The website's architecture is running in the us-west-1 Region and is exhibiting high request latency for users that are located in other parts of the world. The website needs to serve requests quickly and efficiently regardless of a user's location. However, the company does not want to recreate the existing architecture across multiple Regions.

What should a solutions architect do to meet these requirements?

A. Replace the existing architecture with a website that is served from an Amazon S3 bucket. Configure an Amazon CloudFront distribution with the S3 bucket as the origin. Set the cache behavior settings to cache based on the Accept-Language request header.
B. Configure an Amazon CloudFront distribution with the ALB as the origin. Set the cache behavior settings to cache based on the Accept-Language request header.
C. Create an Amazon API Gateway API that is integrated with the ALB. Configure the API to use the HTTP integration type. Set up an API Gateway stage to enable the API cache based on the Accept-Language request header.
D. Launch an EC2 instance in each additional Region and configure NGINX to act as a cache server for that Region. Put all the EC2 instances and the ALB behind an Amazon Route 53 record set with a geolocation routing policy.
Suggested answer: B
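Option B can be expressed with the CloudFront API: put a distribution in front of the ALB and include the Accept-Language header in the cache key so each language variant is cached separately at the edge, without touching the us-west-1 architecture. A sketch with boto3 using the legacy ForwardedValues settings, assuming a hypothetical ALB DNS name:

```python
import time
import boto3

cloudfront = boto3.client("cloudfront")

cloudfront.create_distribution(
    DistributionConfig={
        "CallerReference": str(time.time()),
        "Comment": "Global edge cache in front of the existing ALB",
        "Enabled": True,
        "Origins": {
            "Quantity": 1,
            "Items": [
                {
                    "Id": "alb-origin",
                    "DomainName": "my-alb-123456.us-west-1.elb.amazonaws.com",  # hypothetical ALB
                    "CustomOriginConfig": {
                        "HTTPPort": 80,
                        "HTTPSPort": 443,
                        "OriginProtocolPolicy": "https-only",
                    },
                }
            ],
        },
        "DefaultCacheBehavior": {
            "TargetOriginId": "alb-origin",
            "ViewerProtocolPolicy": "redirect-to-https",
            "MinTTL": 0,
            # Vary the cache key on Accept-Language so each language is cached separately.
            "ForwardedValues": {
                "QueryString": True,
                "Cookies": {"Forward": "none"},
                "Headers": {"Quantity": 1, "Items": ["Accept-Language"]},
            },
        },
    }
)
```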

A telemarketing company is designing its customer call center functionality on AWS. The company needs a solution that provides multiple speaker recognition and generates transcript files. The company wants to query the transcript files to analyze the business patterns. The transcript files must be stored for 7 years for auditing purposes.

Which solution will meet these requirements?

A. Use Amazon Rekognition for multiple speaker recognition. Store the transcript files in Amazon S3. Use machine learning models for transcript file analysis.
B. Use Amazon Transcribe for multiple speaker recognition. Use Amazon Athena for transcript file analysis.
C. Use Amazon Translate for multiple speaker recognition. Store the transcript files in Amazon Redshift. Use SQL queries for transcript file analysis.
D. Use Amazon Rekognition for multiple speaker recognition. Store the transcript files in Amazon S3. Use Amazon Textract for transcript file analysis.
Suggested answer: B

Explanation:

Amazon Transcribe is an automatic speech recognition (ASR) service that makes it easy to convert speech to text, and it supports speaker labeling (diarization) for both batch and streaming transcription. In live audio transcription, each stream of audio may contain multiple speakers; turning on speaker labeling helps identify who is saying what in the output transcript. The transcript files can be stored in Amazon S3 to satisfy the 7-year retention requirement and queried in place with Amazon Athena.

https://aws.amazon.com/about-aws/whats-new/2020/08/amazon-transcribe-supports-speaker-labeling-streaming-transcription/
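A minimal sketch of a batch transcription job with boto3, assuming hypothetical bucket, file, and job names; ShowSpeakerLabels turns on speaker diarization, and the JSON transcript lands in S3 where Athena can query it:

```python
import boto3

transcribe = boto3.client("transcribe")

transcribe.start_transcription_job(
    TranscriptionJobName="call-2024-01-15-0001",                   # hypothetical job name
    Media={"MediaFileUri": "s3://call-recordings/call-0001.wav"},  # hypothetical bucket/file
    MediaFormat="wav",
    LanguageCode="en-US",
    OutputBucketName="call-transcripts",                           # hypothetical output bucket
    Settings={
        "ShowSpeakerLabels": True,  # label each speaker in the transcript
        "MaxSpeakerLabels": 2,      # e.g., agent and customer
    },
)
```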


A company is building a new dynamic ordering website. The company wants to minimize server maintenance and patching. The website must be highly available and must scale read and write capacity as quickly as possible to meet changes in user demand.

Which solution will meet these requirements?

A. Host static content in Amazon S3. Host dynamic content by using Amazon API Gateway and AWS Lambda. Use Amazon DynamoDB with on-demand capacity for the database. Configure Amazon CloudFront to deliver the website content.
B. Host static content in Amazon S3. Host dynamic content by using Amazon API Gateway and AWS Lambda. Use Amazon Aurora with Aurora Auto Scaling for the database. Configure Amazon CloudFront to deliver the website content.
C. Host all the website content on Amazon EC2 instances. Create an Auto Scaling group to scale the EC2 instances. Use an Application Load Balancer to distribute traffic. Use Amazon DynamoDB with provisioned write capacity for the database.
D. Host all the website content on Amazon EC2 instances. Create an Auto Scaling group to scale the EC2 instances. Use an Application Load Balancer to distribute traffic. Use Amazon Aurora with Aurora Auto Scaling for the database.
Suggested answer: A

A company hosts its application on AWS. The company uses Amazon Cognito to manage users. When users log in to the application, the application fetches required data from Amazon DynamoDB by using a REST API that is hosted in Amazon API Gateway. The company wants an AWS managed solution that will control access to the REST API to reduce development efforts.

Which solution will meet these requirements with the LEAST operational overhead?

A. Configure an AWS Lambda function to be an authorizer in API Gateway to validate which user made the request.
B. For each user, create and assign an API key that must be sent with each request. Validate the key by using an AWS Lambda function.
C. Send the user's email address in the header with every request. Invoke an AWS Lambda function to validate that the user with that email address has proper access.
D. Configure an Amazon Cognito user pool authorizer in API Gateway to allow Amazon Cognito to validate each request.
Suggested answer: D
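A Cognito user pool authorizer needs no custom code: API Gateway validates the JWT issued by the user pool on every request. A minimal sketch with boto3, assuming hypothetical REST API and user pool identifiers:

```python
import boto3

apigateway = boto3.client("apigateway")

apigateway.create_authorizer(
    restApiId="a1b2c3d4e5",                 # hypothetical REST API ID
    name="cognito-user-pool-authorizer",
    type="COGNITO_USER_POOLS",
    providerARNs=[
        # Hypothetical user pool ARN.
        "arn:aws:cognito-idp:us-east-1:123456789012:userpool/us-east-1_EXAMPLE"
    ],
    identitySource="method.request.header.Authorization",  # where the JWT is sent
)
```

Each protected method is then configured to use this authorizer, and API Gateway rejects requests whose tokens are missing, expired, or not issued by the pool.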

A company must migrate 20 TB of data from a data center to the AWS Cloud within 30 days. The company's network bandwidth is limited to 15 Mbps and cannot exceed 70% utilization. What should a solutions architect do to meet these requirements?

A. Use AWS Snowball.
B. Use AWS DataSync.
C. Use a secure VPN connection.
D. Use Amazon S3 Transfer Acceleration.
Suggested answer: A

Explanation:

AWS Snowball is a secure, offline data transport service that moves large amounts of data into and out of AWS by shipping a physical device, so the transfer does not depend on the company's limited network link. A single device can hold up to 80 TB, which easily accommodates the 20 TB within the 30-day window. The network-based options cannot meet the deadline: at 15 Mbps capped at 70% utilization (an effective 10.5 Mbps), transferring 20 TB would take roughly 176 days.
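A quick sanity check of the bandwidth math, using only the figures from the question (decimal units assumed, 1 TB = 10^12 bytes):

```python
# Total data to move, in bits.
data_bits = 20 * 10**12 * 8

# Effective throughput: a 15 Mbps link capped at 70% utilization.
effective_bps = 15 * 10**6 * 0.70   # 10.5 Mbps

seconds = data_bits / effective_bps
days = seconds / 86400
print(f"{days:.0f} days")           # ~176 days, far beyond the 30-day window
```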
