Amazon SAA-C03 Practice Test - Questions Answers, Page 89

A finance company is migrating its trading platform to AWS. The trading platform processes a high volume of market data and processes stock trades. The company needs to establish a consistent, low-latency network connection from its on-premises data center to AWS.

The company will host resources in a VPC. The solution must not use the public internet.

Which solution will meet these requirements?

A. Use AWS Client VPN to connect the on-premises data center to AWS.

B. Use AWS Direct Connect to set up a connection from the on-premises data center to AWS.

C. Use AWS PrivateLink to set up a connection from the on-premises data center to AWS.

D. Use AWS Site-to-Site VPN to connect the on-premises data center to AWS.
Suggested answer: B

Explanation:

AWS Direct Connect is the best solution for establishing a consistent, low-latency connection from an on-premises data center to AWS without using the public internet. Direct Connect offers dedicated, high-throughput, and low-latency network connections, which are ideal for performance-sensitive applications like a trading platform that processes high volumes of market data and stock trades.

Direct Connect provides a private connection to your AWS VPC, ensuring that data doesn't traverse the public internet, which enhances both security and performance consistency.

AWS Reference:

AWS Direct Connect provides a dedicated network connection to AWS services with consistent, low-latency performance.

Best Practices for High Performance on AWS covers performance-sensitive workloads such as trading platforms.

Why the other options are incorrect:

A. AWS Client VPN: While this offers secure connectivity, it operates over the public internet and is not designed for the low-latency, high-performance needs of a trading platform.

C. AWS PrivateLink: PrivateLink connects VPCs to AWS services and endpoints within AWS; it is not designed for connecting on-premises data centers to AWS.

D. AWS Site-to-Site VPN: Although this provides secure connectivity, it runs over the public internet, which can introduce variable latency and does not meet the low-latency requirements of the use case.


A company tracks customer satisfaction by using surveys that the company hosts on its website. The surveys sometimes reach thousands of customers every hour. Survey results are currently sent in email messages to the company so company employees can manually review results and assess customer sentiment.

The company wants to automate the customer survey process. Survey results must be available for the previous 12 months.

Which solution will meet these requirements in the MOST scalable way?

A. Send the survey results data to an Amazon API Gateway endpoint that is connected to an Amazon Simple Queue Service (Amazon SQS) queue. Create an AWS Lambda function to poll the SQS queue, call Amazon Comprehend for sentiment analysis, and save the results to an Amazon DynamoDB table. Set the TTL for all records to 365 days in the future.

B. Send the survey results data to an API that is running on an Amazon EC2 instance. Configure the API to store the survey results as a new record in an Amazon DynamoDB table, call Amazon Comprehend for sentiment analysis, and save the results in a second DynamoDB table. Set the TTL for all records to 365 days in the future.

C. Write the survey results data to an Amazon S3 bucket. Use S3 Event Notifications to invoke an AWS Lambda function to read the data and call Amazon Rekognition for sentiment analysis. Store the sentiment analysis results in a second S3 bucket. Use S3 Lifecycle policies on each bucket to expire objects after 365 days.

D. Send the survey results data to an Amazon API Gateway endpoint that is connected to an Amazon Simple Queue Service (Amazon SQS) queue. Configure the SQS queue to invoke an AWS Lambda function that calls Amazon Lex for sentiment analysis and saves the results to an Amazon DynamoDB table. Set the TTL for all records to 365 days in the future.
Suggested answer: A

Explanation:

This solution is the most scalable and efficient way to handle large volumes of survey data while automating sentiment analysis:

API Gateway and SQS: The survey results are sent to API Gateway, which forwards the data to an SQS queue. SQS can handle large volumes of messages and ensures that messages are not lost.

AWS Lambda: Lambda polls the SQS queue (through the SQS event source mapping) and processes the survey data in batches.

Amazon Comprehend: Comprehend is used for sentiment analysis, providing insights into customer satisfaction.

DynamoDB with TTL: Results are stored in DynamoDB with a Time to Live (TTL) attribute set to expire after 365 days, automatically removing old data and reducing storage costs.

Option B (EC2 API): Running an API on EC2 requires more maintenance and scalability management compared to API Gateway.

Option C (S3 and Rekognition): Amazon Rekognition is for image and video analysis, not sentiment analysis.

Option D (Amazon Lex): Amazon Lex is used for building conversational interfaces, not sentiment analysis.

AWS Reference:

Amazon Comprehend for Sentiment Analysis

Amazon SQS

DynamoDB TTL
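The processing step in option A can be sketched as a Lambda handler. This is a minimal illustration, not code from the exam: the Comprehend and DynamoDB calls are stubbed out as comments (they require AWS credentials), and the field names (`surveyId`, `responseText`) and the helper `build_survey_item` are assumptions. The TTL attribute is an epoch-seconds timestamp 365 days ahead, which DynamoDB uses to expire the item automatically.

```python
import json
import time

def build_survey_item(survey_id, text, sentiment, now=None):
    """Build the DynamoDB item for one survey result, with a TTL
    attribute set 365 days in the future (epoch seconds)."""
    now = int(time.time()) if now is None else now
    return {
        "surveyId": survey_id,
        "responseText": text,
        "sentiment": sentiment,           # e.g. POSITIVE / NEGATIVE / NEUTRAL / MIXED
        "ttl": now + 365 * 24 * 60 * 60,  # DynamoDB deletes the item after this time
    }

def handler(event, context=None):
    """Lambda handler shape for an SQS event source: one record per message."""
    items = []
    for record in event["Records"]:
        body = json.loads(record["body"])
        # In the real function this would be:
        #   comprehend.detect_sentiment(Text=body["text"], LanguageCode="en")
        sentiment = "NEUTRAL"  # placeholder; Comprehend call mocked out here
        items.append(build_survey_item(body["surveyId"], body["text"], sentiment))
    # In the real function each item would be written with table.put_item(Item=item)
    return items
```

Because the TTL is just an attribute on the item, no cleanup job is needed; DynamoDB removes expired records in the background at no extra cost.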

An online gaming company is transitioning user data storage to Amazon DynamoDB to support the company's growing user base. The current architecture includes DynamoDB tables that contain user profiles, achievements, and in-game transactions.

The company needs to design a robust, continuously available, and resilient DynamoDB architecture to maintain a seamless gaming experience for users.

Which solution will meet these requirements MOST cost-effectively?

A. Create DynamoDB tables in a single AWS Region. Use on-demand capacity mode. Use global tables to replicate data across multiple Regions.

B. Use DynamoDB Accelerator (DAX) to cache frequently accessed data. Deploy tables in a single AWS Region and enable auto scaling. Configure Cross-Region Replication manually to additional Regions.

C. Create DynamoDB tables in multiple AWS Regions. Use on-demand capacity mode. Use DynamoDB Streams for Cross-Region Replication between Regions.

D. Use DynamoDB global tables for automatic multi-Region replication. Deploy tables in multiple AWS Regions. Use provisioned capacity mode. Enable auto scaling.
Suggested answer: D

Explanation:

DynamoDB Global Tables provide a fully managed, multi-region, and multi-master database solution that allows you to deploy DynamoDB tables in multiple AWS Regions. This ensures high availability and resiliency across different geographical locations, providing a seamless gaming experience for users. Using provisioned capacity mode with auto-scaling ensures cost-efficiency by scaling up or down based on actual demand.

Option A: While on-demand capacity mode is flexible, provisioned capacity with auto-scaling is more cost-effective for predictable workloads.

Option B (DAX): DAX improves read performance, but it doesn't provide the multi-region replication needed for high availability and resiliency.

Option C: DynamoDB Streams with manual cross-region replication adds more complexity and operational overhead compared to Global Tables.

AWS Reference:

DynamoDB Global Tables
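The cost argument behind preferring provisioned capacity for a steady workload can be made concrete with a back-of-the-envelope comparison. The prices below are assumptions for illustration only (roughly us-east-1 list prices at one point in time; always check current DynamoDB pricing), and the function name is hypothetical:

```python
# Illustrative only: provisioned vs. on-demand DynamoDB write cost for a
# steady workload. Both unit prices are ASSUMED for this sketch.
PROVISIONED_WCU_PER_HOUR = 0.00065   # $ per WCU-hour (assumed)
ON_DEMAND_PER_MILLION_WRITES = 1.25  # $ per million write request units (assumed)

def monthly_write_cost(writes_per_second, hours=730):
    """Return (provisioned, on_demand) monthly write cost in dollars."""
    # Provisioned: capacity sized to the steady rate, billed per WCU-hour.
    provisioned = writes_per_second * PROVISIONED_WCU_PER_HOUR * hours
    # On-demand: billed per request, regardless of shape.
    total_writes = writes_per_second * 3600 * hours
    on_demand = total_writes / 1_000_000 * ON_DEMAND_PER_MILLION_WRITES
    return provisioned, on_demand

prov, od = monthly_write_cost(100)  # steady 100 writes/second
# Under these assumed prices, provisioned capacity is several times cheaper.
```

The gap narrows or reverses for spiky, unpredictable traffic, which is why on-demand mode exists; but for the predictable baseline of an established gaming user base, provisioned capacity with auto scaling tracks demand at the lower unit price.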

A logistics company is creating a data exchange platform to share shipment status information with shippers. The logistics company can see all shipment information and metadata. The company distributes shipment data updates to shippers.

Each shipper should see only shipment updates that are relevant to their company. Shippers should not see the full detail that is visible to the logistics company. The company creates an Amazon Simple Notification Service (Amazon SNS) topic for each shipper to share data. Some shippers use a mobile app to submit shipment status updates.

The company needs to create a data exchange platform that provides each shipper specific access to the data that is relevant to their company.

Which solution will meet these requirements with the LEAST operational overhead?

A. Ingest the shipment updates from the mobile app into Amazon Simple Queue Service (Amazon SQS). Publish the updates to the SNS topic. Apply a filter policy to rewrite the body of each message.

B. Ingest the shipment updates from the mobile app into Amazon Simple Queue Service (Amazon SQS). Use an AWS Lambda function to consume the updates from Amazon SQS and rewrite the body of each message. Publish the updates to the SNS topic.

C. Ingest the shipment updates from the mobile app into a second SNS topic. Publish the updates to the shipper SNS topic. Apply a filter policy to rewrite the body of each message.

D. Ingest the shipment updates from the mobile app into Amazon Simple Queue Service (Amazon SQS). Filter and rewrite the messages in Amazon EventBridge Pipes. Publish the updates to the SNS topic.
Suggested answer: B

Explanation:

The best solution is to use Amazon SQS to receive updates from the mobile app and process them with an AWS Lambda function. The Lambda function can rewrite the message body as necessary for each shipper and then publish the updates to the appropriate SNS topic for distribution. This setup ensures that each shipper receives only the relevant data and minimizes operational overhead by using managed services.

Option A (SNS filter policy): SNS does not have the capability to rewrite message bodies before forwarding.

Option C (Second SNS topic): Using an additional SNS topic adds unnecessary complexity without solving the message rewriting requirement.

Option D (EventBridge Pipes): EventBridge Pipes is more complex than necessary for this use case, and Lambda can handle the logic more efficiently.

AWS Reference:

Amazon SQS

Amazon SNS with Lambda
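The "rewrite" step that SNS filter policies cannot do (option A's flaw) is straightforward in the Lambda function of option B. A minimal sketch, assuming hypothetical field names (`shipmentId`, `status`, `eta`, `lastLocation` visible to shippers; everything else internal) and stubbing out the `sns.publish()` call:

```python
import json

# Fields each shipper is allowed to see; internal metadata such as cost
# or margin is stripped. These field names are illustrative assumptions.
SHIPPER_VISIBLE_FIELDS = {"shipmentId", "status", "eta", "lastLocation"}

def rewrite_for_shipper(full_record: dict) -> dict:
    """Reduce a full logistics record to the shipper-visible subset."""
    return {k: v for k, v in full_record.items() if k in SHIPPER_VISIBLE_FIELDS}

def handler(event, context=None):
    """SQS-triggered Lambda: rewrite each message body. In the real
    function, each rewritten message would then be published to the
    matching shipper's SNS topic with sns.publish()."""
    out = []
    for record in event["Records"]:
        full = json.loads(record["body"])
        out.append(rewrite_for_shipper(full))
    return out
```

Keeping the allow-list of visible fields in one place makes the access rule auditable, and the same function scales to any number of shipper topics without per-shipper code.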

A company is deploying a new gaming application on Amazon EC2 instances. The gaming application needs to have access to shared storage.

The company requires a high-performance solution to give the application the ability to use an existing custom protocol to access shared storage. The solution must ensure low latency and must be operationally efficient.

Which solution will meet these requirements?

A. Create an Amazon FSx File Gateway. Create a file share that uses the existing custom protocol. Connect the EC2 instances that host the application to the file share.

B. Create an Amazon EC2 Windows instance. Install and configure a Windows file share role on the instance. Connect the EC2 instances that host the application to the file share.

C. Create an Amazon Elastic File System (Amazon EFS) file system. Configure the file system to support Lustre. Connect the EC2 instances that host the application to the file system.

D. Create an Amazon FSx for Lustre file system. Connect the EC2 instances that host the application to the file system.
Suggested answer: D

Explanation:

Amazon FSx for Lustre is a high-performance, fully managed file system that is ideal for applications requiring low-latency access to shared storage, especially in use cases like gaming where high throughput and low latency are essential. It integrates directly with EC2 instances, provides fast and scalable shared storage, and the Lustre client protocol serves the application's existing high-performance access needs.

Option A (FSx File Gateway): FSx File Gateway is designed for hybrid cloud storage and is not suited for high-performance gaming workloads.

Option B (EC2 Windows instance): Setting up a file share on a Windows instance would introduce additional administrative overhead and would not provide the necessary performance.

Option C (EFS with Lustre): While Lustre is integrated with FSx, EFS does not natively support Lustre.

AWS Reference:

Amazon FSx for Lustre

A company is developing a rating system for its ecommerce web application. The company needs a solution to save ratings that users submit in an Amazon DynamoDB table.

The company wants to ensure that developers do not need to interact directly with the DynamoDB table. The solution must be scalable and reusable.

Which solution will meet these requirements with the LEAST operational overhead?

A. Create an Application Load Balancer (ALB). Create an AWS Lambda function, and set the function as a target group in the ALB. Invoke the Lambda function by using the put_item method through the ALB.

B. Create an AWS Lambda function. Configure the Lambda function to interact with the DynamoDB table by using the put-item method from Boto3. Invoke the Lambda function from the web application.

C. Create an Amazon Simple Queue Service (Amazon SQS) queue and an AWS Lambda function that has an SQS trigger type. Instruct the developers to add customer ratings to the SQS queue as JSON messages. Configure the Lambda function to fetch the ratings from the queue and store the ratings in DynamoDB.

D. Create an Amazon API Gateway REST API. Define a resource and create a new POST method. Choose AWS as the integration type, and select DynamoDB as the service. Set the action to PutItem.
Suggested answer: D

Explanation:

Amazon API Gateway provides a scalable and reusable solution for interacting with DynamoDB without requiring direct access by developers. By setting up a REST API with a POST method that integrates with DynamoDB's PutItem action, developers can submit data (such as user ratings) to the DynamoDB table through API Gateway, without having to directly interact with the database. This solution is serverless and minimizes operational overhead.

Option A: Using ALB with Lambda adds complexity and is less efficient for this use case.

Option B: While using Lambda is possible, API Gateway provides a more scalable, reusable interface.

Option C: SQS with Lambda introduces unnecessary components for a simple put operation.

AWS Reference:

Amazon API Gateway with DynamoDB
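With the AWS-service integration, API Gateway's request mapping template transforms the client's JSON into a DynamoDB `PutItem` request body using DynamoDB's typed attribute notation (`{"S": ...}` for strings, `{"N": ...}` for numbers, with numbers sent as strings). The sketch below builds that payload shape in Python for illustration; the table name, attribute names, and helper function are assumptions, not values from the exam:

```python
def put_item_payload(table_name: str, rating: dict, request_id: str) -> dict:
    """Build the DynamoDB PutItem request body that an API Gateway
    mapping template would emit from an incoming rating POST."""
    return {
        "TableName": table_name,
        "Item": {
            "ratingId": {"S": request_id},               # e.g. $context.requestId
            "productId": {"S": rating["productId"]},
            "rating": {"N": str(rating["rating"])},      # DynamoDB numbers are strings
        },
    }
```

Because the mapping template does this transformation inside API Gateway, no Lambda function or application server sits between the client and DynamoDB, which is what makes option D the lowest-overhead choice.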

Total 886 questions