Amazon SAA-C03 Practice Test - Questions Answers, Page 40


A company has deployed a Java Spring Boot application as a pod that runs on Amazon Elastic Kubernetes Service (Amazon EKS) in private subnets. The application needs to write data to an Amazon DynamoDB table. A solutions architect must ensure that the application can interact with the DynamoDB table without exposing traffic to the internet.

Which combination of steps should the solutions architect take to accomplish this goal? (Choose two.)

A. Attach an IAM role that has sufficient privileges to the EKS pod.
B. Attach an IAM user that has sufficient privileges to the EKS pod.
C. Allow outbound connectivity to the DynamoDB table through the private subnets’ network ACLs.
D. Create a VPC endpoint for DynamoDB.
E. Embed the access keys in the Java Spring Boot code.
Suggested answer: A, D

Explanation:

https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/vpc-endpoints-dynamodb.html

https://aws.amazon.com/about-aws/whats-new/2019/09/amazon-eks-adds-support-to-assign-iam-permissions-to-kubernetes-service-accounts/
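
Answer A is typically implemented with IAM Roles for Service Accounts (IRSA), which binds an IAM role to the pod's Kubernetes service account so the AWS SDK's default credential chain picks up temporary credentials automatically. Below is a minimal sketch of the pod-side code, written in Python with boto3 rather than the question's Java Spring Boot (the AWS SDK for Java resolves credentials the same way) and assuming a hypothetical UserActivity table. No access keys appear in code, and with the gateway endpoint from answer D in place the requests never leave the VPC.

```python
import boto3

# With IRSA, the pod's service account is annotated with a role ARN and the
# SDK obtains temporary credentials through the default provider chain; no
# keys are embedded in the application.
dynamodb = boto3.resource("dynamodb")

# "UserActivity" and the item attributes are hypothetical, for illustration.
table = dynamodb.Table("UserActivity")

table.put_item(
    Item={
        "userId": "user-123",  # partition key (assumed schema)
        "eventTime": "2023-01-01T00:00:00Z",
        "action": "login",
    }
)
```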

A company recently migrated its web application to AWS by rehosting the application on Amazon EC2 instances in a single AWS Region. The company wants to redesign its application architecture to be highly available and fault tolerant. Traffic must reach all running EC2 instances randomly.

Which combination of steps should the company take to meet these requirements? (Choose two.)

A. Create an Amazon Route 53 failover routing policy.
B. Create an Amazon Route 53 weighted routing policy.
C. Create an Amazon Route 53 multivalue answer routing policy.
D. Launch three EC2 instances: two instances in one Availability Zone and one instance in another Availability Zone.
E. Launch four EC2 instances: two instances in one Availability Zone and two instances in another Availability Zone.
Suggested answer: C, E

Explanation:

https://aws.amazon.com/premiumsupport/knowledge-center/multivalue-versus-simple-policies/
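
As a rough illustration of option C, the sketch below creates one multivalue answer record per instance with boto3; the hosted zone ID, record name, and instance IPs are hypothetical. Route 53 returns up to eight healthy records in random order, which is what lets traffic reach all running instances randomly.

```python
import boto3

route53 = boto3.client("route53")

# Hypothetical hosted zone ID and instance IPs, for illustration only.
HOSTED_ZONE_ID = "Z0000000000000000000"
INSTANCE_IPS = ["10.0.1.10", "10.0.1.11", "10.0.2.10", "10.0.2.11"]

# One multivalue answer record per instance; each record needs a unique
# SetIdentifier. Attaching health checks would let Route 53 drop records
# for failed instances.
changes = [
    {
        "Action": "UPSERT",
        "ResourceRecordSet": {
            "Name": "app.example.com",
            "Type": "A",
            "SetIdentifier": f"instance-{i}",
            "MultiValueAnswer": True,
            "TTL": 60,
            "ResourceRecords": [{"Value": ip}],
        },
    }
    for i, ip in enumerate(INSTANCE_IPS)
]

route53.change_resource_record_sets(
    HostedZoneId=HOSTED_ZONE_ID,
    ChangeBatch={"Changes": changes},
)
```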

A media company collects and analyzes user activity data on premises. The company wants to migrate this capability to AWS. The user activity data store will continue to grow and will be petabytes in size. The company needs to build a highly available data ingestion solution that facilitates on-demand analytics of existing data and new data with SQL.

Which solution will meet these requirements with the LEAST operational overhead?

A. Send activity data to an Amazon Kinesis data stream. Configure the stream to deliver the data to an Amazon S3 bucket.
B. Send activity data to an Amazon Kinesis Data Firehose delivery stream. Configure the stream to deliver the data to an Amazon Redshift cluster.
C. Place activity data in an Amazon S3 bucket. Configure Amazon S3 to run an AWS Lambda function on the data as the data arrives in the S3 bucket.
D. Create an ingestion service on Amazon EC2 instances that are spread across multiple Availability Zones. Configure the service to forward data to an Amazon RDS Multi-AZ database.
Suggested answer: B

Explanation:

Amazon Redshift is a fully managed, petabyte-scale data warehouse service in the cloud. You can start with just a few hundred gigabytes of data and scale to a petabyte or more. This allows you to use your data to gain new insights for your business and customers. The first step to create a data warehouse is to launch a set of nodes, called an Amazon Redshift cluster. After you provision your cluster, you can upload your data set and then perform data analysis queries. Regardless of the size of the data set, Amazon Redshift offers fast query performance using the same SQL-based tools and business intelligence applications that you use today. Amazon Kinesis Data Firehose supplies the fully managed, highly available ingestion layer: it delivers streaming data directly to the Redshift cluster with no servers or shards to manage, which is why option B has the least operational overhead.
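
A minimal sketch of the producer side, assuming a delivery stream named activity-ingest (a hypothetical name) that has already been configured with the Redshift cluster as its destination; Firehose stages the records in S3 and issues the COPY into Redshift on the company's behalf.

```python
import json
import boto3

firehose = boto3.client("firehose")

# A single user-activity event; the fields are assumed for illustration.
record = {"userId": "user-123", "page": "/home", "ts": "2023-01-01T00:00:00Z"}

# Firehose does not add record separators itself, so append a newline to
# keep one JSON document per line for the downstream COPY.
firehose.put_record(
    DeliveryStreamName="activity-ingest",
    Record={"Data": (json.dumps(record) + "\n").encode("utf-8")},
)
```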

A company has a serverless website with millions of objects in an Amazon S3 bucket. The company uses the S3 bucket as the origin for an Amazon CloudFront distribution. The company did not set encryption on the S3 bucket before the objects were loaded. A solutions architect needs to enable encryption for all existing objects and for all objects that are added to the S3 bucket in the future.

Which solution will meet these requirements with the LEAST amount of effort?

A. Create a new S3 bucket. Turn on the default encryption settings for the new S3 bucket. Download all existing objects to temporary local storage. Upload the objects to the new S3 bucket.
B. Turn on the default encryption settings for the S3 bucket. Use the S3 Inventory feature to create a .csv file that lists the unencrypted objects. Run an S3 Batch Operations job that uses the copy command to encrypt those objects.
C. Create a new encryption key by using AWS Key Management Service (AWS KMS). Change the settings on the S3 bucket to use server-side encryption with AWS KMS managed encryption keys (SSE-KMS). Turn on versioning for the S3 bucket.
D. Navigate to Amazon S3 in the AWS Management Console. Browse the S3 bucket’s objects. Sort by the encryption field. Select each unencrypted object. Use the Modify button to apply default encryption settings to every unencrypted object in the S3 bucket.
Suggested answer: B

Explanation:

https://spin.atomicobject.com/2020/09/15/aws-s3-encrypt-existing-objects/
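
The "new objects" half of answer B is a one-time bucket setting. Here is a minimal sketch with boto3, using a hypothetical bucket name; the existing objects still need the S3 Inventory manifest plus the S3 Batch Operations copy job described in the answer.

```python
import boto3

s3 = boto3.client("s3")

# Default encryption covers all NEW objects from this point on. Existing
# objects remain unencrypted until the S3 Batch Operations copy job
# rewrites them in place.
s3.put_bucket_encryption(
    Bucket="my-website-bucket",  # hypothetical bucket name
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
        ]
    },
)
```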

A solutions architect is designing a new API using Amazon API Gateway that will receive requests from users. The volume of requests is highly variable; several hours can pass without receiving a single request. The data processing will take place asynchronously, but should be completed within a few seconds after a request is made.

Which compute service should the solutions architect have the API invoke to deliver the requirements at the lowest cost?

A. An AWS Glue job
B. An AWS Lambda function
C. A containerized service hosted in Amazon Elastic Kubernetes Service (Amazon EKS)
D. A containerized service hosted in Amazon ECS with Amazon EC2
Suggested answer: B

Explanation:

AWS Lambda runs code only when it is invoked and is billed per request and per unit of execution time, so nothing is charged during the hours with no traffic. It scales automatically with the highly variable request volume, and an asynchronous invocation from API Gateway completes the processing within seconds of a request. The EKS and ECS-on-EC2 options (C and D) bill for idle capacity, and an AWS Glue job (A) is built for ETL workloads, not API backends.
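
A minimal sketch of such a function, assuming API Gateway is configured to invoke it asynchronously (the Event invocation type) and uses a proxy integration that passes the request body through; the field names are hypothetical.

```python
import json


def handler(event, context):
    """Sketch of the asynchronous worker behind API Gateway.

    With an Event (asynchronous) invocation, API Gateway acknowledges the
    caller immediately and Lambda runs this code in the background; the
    return value is not sent back to the client.
    """
    # Proxy integrations deliver the request body as a JSON string.
    body = json.loads(event.get("body") or "{}")

    # ... a few seconds of processing on the request payload ...
    print(f"processed request: {body}")
    return {"status": "done"}
```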

A company hosts multiple production applications. One of the applications consists of resources from Amazon EC2, AWS Lambda, Amazon RDS, Amazon Simple Notification Service (Amazon SNS), and Amazon Simple Queue Service (Amazon SQS) across multiple AWS Regions. All company resources are tagged with a tag name of “application” and a value that corresponds to each application. A solutions architect must provide the quickest solution for identifying all of the tagged components.

Which solution meets these requirements?

A. Use AWS CloudTrail to generate a list of resources with the application tag.
B. Use the AWS CLI to query each service across all Regions to report the tagged components.
C. Run a query in Amazon CloudWatch Logs Insights to report on the components with the application tag.
D. Run a query with the AWS Resource Groups Tag Editor to report on the resources globally with the application tag.
Suggested answer: D

Explanation:

https://docs.aws.amazon.com/tag-editor/latest/userguide/tagging.html
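
Tag Editor is the console front end for the Resource Groups Tagging API, which can also be scripted. A sketch with boto3, assuming a hypothetical tag value of "payments"; the call is Regional, so it would be repeated for each Region the application uses.

```python
import boto3

# The Resource Groups Tagging API searches across services within a Region,
# so a single tag filter finds the EC2, Lambda, RDS, SNS, and SQS resources.
tagging = boto3.client("resourcegroupstaggingapi", region_name="us-east-1")

paginator = tagging.get_paginator("get_resources")
for page in paginator.paginate(
    TagFilters=[{"Key": "application", "Values": ["payments"]}]
):
    for resource in page["ResourceTagMappingList"]:
        print(resource["ResourceARN"])
```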

A company’s reporting system delivers hundreds of .csv files to an Amazon S3 bucket each day. The company must convert these files to Apache Parquet format and must store the files in a transformed data bucket.

Which solution will meet these requirements with the LEAST development effort?

A. Create an Amazon EMR cluster with Apache Spark installed. Write a Spark application to transform the data. Use EMR File System (EMRFS) to write files to the transformed data bucket.
B. Create an AWS Glue crawler to discover the data. Create an AWS Glue extract, transform, and load (ETL) job to transform the data. Specify the transformed data bucket in the output step.
C. Use AWS Batch to create a job definition with Bash syntax to transform the data and output the data to the transformed data bucket. Use the job definition to submit a job. Specify an array job as the job type.
D. Create an AWS Lambda function to transform the data and output the data to the transformed data bucket. Configure an event notification for the S3 bucket. Specify the Lambda function as the destination for the event notification.
Suggested answer: B

Explanation:

https://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/three-aws-glue-etl-job-types-for-converting-data-to-apache-parquet.html
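
For reference, the body of such a Glue ETL job is only a few lines of PySpark. A sketch that runs inside the Glue job environment, assuming the crawler has already cataloged the .csv files as my_db.raw_csv and that transformed-data-bucket is the target (both names hypothetical).

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the crawled .csv data through the Glue Data Catalog.
source = glue_context.create_dynamic_frame.from_catalog(
    database="my_db", table_name="raw_csv"
)

# Write the same records back out as Parquet into the transformed bucket.
glue_context.write_dynamic_frame.from_options(
    frame=source,
    connection_type="s3",
    connection_options={"path": "s3://transformed-data-bucket/"},
    format="parquet",
)

job.commit()
```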

A company wants to manage Amazon Machine Images (AMIs). The company currently copies AMIs to the same AWS Region where the AMIs were created. The company needs to design an application that captures AWS API calls and sends alerts whenever the Amazon EC2 CreateImage API operation is called within the company’s account.

Which solution will meet these requirements with the LEAST operational overhead?

A. Create an AWS Lambda function to query AWS CloudTrail logs and to send an alert when a CreateImage API call is detected.
B. Configure AWS CloudTrail with an Amazon Simple Notification Service (Amazon SNS) notification that occurs when updated logs are sent to Amazon S3. Use Amazon Athena to create a new table and to query on CreateImage when an API call is detected.
C. Create an Amazon EventBridge (Amazon CloudWatch Events) rule for the CreateImage API call. Configure the target as an Amazon Simple Notification Service (Amazon SNS) topic to send an alert when a CreateImage API call is detected.
D. Configure an Amazon Simple Queue Service (Amazon SQS) FIFO queue as a target for AWS CloudTrail logs. Create an AWS Lambda function to send an alert to an Amazon Simple Notification Service (Amazon SNS) topic when a CreateImage API call is detected.
Suggested answer: C

Explanation:

https://docs.aws.amazon.com/AWSEC2/latest/WindowsGuide/monitor-ami-events.html: 'For example, you can create an EventBridge rule that detects when the AMI creation process has completed and then invokes an Amazon SNS topic to send an email notification to you.'
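
A sketch of answer C with boto3. The rule name and topic ARN are hypothetical; CloudTrail must be recording management events in the account, and the SNS topic's access policy must allow events.amazonaws.com to publish.

```python
import json

import boto3

events = boto3.client("events")

RULE_NAME = "create-image-alert"  # hypothetical names throughout
SNS_TOPIC_ARN = "arn:aws:sns:us-east-1:111122223333:ami-alerts"

# Match the CreateImage call as CloudTrail records it.
pattern = {
    "source": ["aws.ec2"],
    "detail-type": ["AWS API Call via CloudTrail"],
    "detail": {
        "eventSource": ["ec2.amazonaws.com"],
        "eventName": ["CreateImage"],
    },
}

events.put_rule(Name=RULE_NAME, EventPattern=json.dumps(pattern))
events.put_targets(
    Rule=RULE_NAME,
    Targets=[{"Id": "sns-alert", "Arn": SNS_TOPIC_ARN}],
)
```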

A company offers a food delivery service that is growing rapidly. Because of the growth, the company’s order processing system is experiencing scaling problems during peak traffic hours. The current architecture includes the following:

• A group of Amazon EC2 instances that run in an Amazon EC2 Auto Scaling group to collect orders from the application
• Another group of EC2 instances that run in an Amazon EC2 Auto Scaling group to fulfill orders

The order collection process occurs quickly, but the order fulfillment process can take longer. Data must not be lost because of a scaling event. A solutions architect must ensure that the order collection process and the order fulfillment process can both scale properly during peak traffic hours. The solution must optimize utilization of the company’s AWS resources.

Which solution meets these requirements?

A. Use Amazon CloudWatch metrics to monitor the CPU of each instance in the Auto Scaling groups. Configure each Auto Scaling group’s minimum capacity according to peak workload values.
B. Use Amazon CloudWatch metrics to monitor the CPU of each instance in the Auto Scaling groups. Configure a CloudWatch alarm to invoke an Amazon Simple Notification Service (Amazon SNS) topic that creates additional Auto Scaling groups on demand.
C. Provision two Amazon Simple Queue Service (Amazon SQS) queues: one for order collection and another for order fulfillment. Configure the EC2 instances to poll their respective queue. Scale the Auto Scaling groups based on notifications that the queues send.
D. Provision two Amazon Simple Queue Service (Amazon SQS) queues: one for order collection and another for order fulfillment. Configure the EC2 instances to poll their respective queue. Create a metric based on a backlog per instance calculation. Scale the Auto Scaling groups based on this metric.
Suggested answer: D

Explanation:

The number of instances in your Auto Scaling group can be driven by how long it takes to process a message and the acceptable amount of latency (queue delay). The solution is to use a backlog per instance metric with the target value being the acceptable backlog per instance to maintain.
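
A sketch of that calculation, run on a schedule per queue, with hypothetical queue and Auto Scaling group names; publishing the result as a custom CloudWatch metric lets a target tracking policy hold the backlog per instance at the acceptable value.

```python
import boto3

sqs = boto3.client("sqs")
autoscaling = boto3.client("autoscaling")
cloudwatch = boto3.client("cloudwatch")

# Hypothetical names, one pair per processing stage.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/111122223333/order-fulfillment"
ASG_NAME = "order-fulfillment-asg"

# Backlog per instance = visible messages / running instances.
attrs = sqs.get_queue_attributes(
    QueueUrl=QUEUE_URL, AttributeNames=["ApproximateNumberOfMessages"]
)
backlog = int(attrs["Attributes"]["ApproximateNumberOfMessages"])

group = autoscaling.describe_auto_scaling_groups(
    AutoScalingGroupNames=[ASG_NAME]
)["AutoScalingGroups"][0]
instances = max(len(group["Instances"]), 1)  # avoid division by zero

cloudwatch.put_metric_data(
    Namespace="OrderProcessing",
    MetricData=[{
        "MetricName": "BacklogPerInstance",
        "Dimensions": [{"Name": "AutoScalingGroupName", "Value": ASG_NAME}],
        "Value": backlog / instances,
    }],
)
```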

A company is developing a marketing communications service that targets mobile app users. The company needs to send confirmation messages with Short Message Service (SMS) to its users. The users must be able to reply to the SMS messages. The company must store the responses for a year for analysis.

What should a solutions architect do to meet these requirements?

A. Create an Amazon Connect contact flow to send the SMS messages. Use AWS Lambda to process the responses.
B. Build an Amazon Pinpoint journey. Configure Amazon Pinpoint to send events to an Amazon Kinesis data stream for analysis and archiving.
C. Use Amazon Simple Queue Service (Amazon SQS) to distribute the SMS messages. Use AWS Lambda to process the responses.
D. Create an Amazon Simple Notification Service (Amazon SNS) FIFO topic. Subscribe an Amazon Kinesis data stream to the SNS topic for analysis and archiving.
Suggested answer: B

Explanation:

https://aws.amazon.com/pinpoint/product-details/sms/

Two-Way Messaging: Receive SMS messages from your customers and reply back to them in a chat-like interactive experience. With Amazon Pinpoint, you can create automatic responses when customers send you messages that contain certain keywords. You can even use Amazon Lex to create conversational bots.

A majority of mobile phone users read incoming SMS messages almost immediately after receiving them. If you need to be able to provide your customers with urgent or important information, SMS messaging may be the right solution for you. You can use Amazon Pinpoint to create targeted groups of customers, and then send them campaign-based messages. You can also use Amazon Pinpoint to send direct messages, such as appointment confirmations, order updates, and one-time passwords.
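
A sketch of the outbound half with boto3, using a hypothetical Pinpoint project (application) ID and destination number. Replies come back through Pinpoint's two-way SMS support, and pointing the project's event stream at a Kinesis data stream captures them for the year of analysis, as answer B describes.

```python
import boto3

pinpoint = boto3.client("pinpoint")

# Hypothetical project ID and destination number, for illustration only.
APPLICATION_ID = "exampleappid1234567890"
PHONE_NUMBER = "+12065550100"

pinpoint.send_messages(
    ApplicationId=APPLICATION_ID,
    MessageRequest={
        "Addresses": {PHONE_NUMBER: {"ChannelType": "SMS"}},
        "MessageConfiguration": {
            "SMSMessage": {
                "Body": "Your order has shipped. Reply YES to confirm.",
                "MessageType": "TRANSACTIONAL",
            }
        },
    },
)
```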
