Amazon SAA-C03 Practice Test - Questions Answers, Page 53
List of questions
Question 521
A social media company runs its application on Amazon EC2 instances behind an Application Load Balancer (ALB). The ALB is the origin for an Amazon CloudFront distribution. The application has more than a billion images stored in an Amazon S3 bucket and processes thousands of images each second. The company wants to resize the images dynamically and serve appropriate formats to clients.
Which solution will meet these requirements with the LEAST operational overhead?
Explanation:
To resize images dynamically and serve appropriate formats to clients, a Lambda@Edge function with an external image management library can be used. Lambda@Edge runs custom code at CloudFront edge locations, so it can process the images on the fly and optimize them for different devices and browsers. An external image management library provides the resizing and format-conversion features, for example serving WebP to browsers that advertise support for it.
Reference:
Lambda@Edge
Resizing Images with Amazon CloudFront & Lambda@Edge
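A minimal sketch of this pattern, assuming a Python origin-request trigger that rewrites the image URI based on a `w` query parameter and the browser's Accept header; the key layout `images/<width>/<format>/<name>`, the parameter name, and the default width are illustrative assumptions, and a companion function using an image library would generate any variants that do not yet exist:

```python
# Hypothetical Lambda@Edge origin-request handler (Python runtime).
# It rewrites the request URI so CloudFront fetches a pre-sized, format-specific
# object from S3 instead of the original image.
from urllib.parse import parse_qs


def handler(event, context):
    request = event["Records"][0]["cf"]["request"]
    params = parse_qs(request.get("querystring", ""))

    width = params.get("w", ["800"])[0]  # requested width, default 800 px (assumption)
    accept = next((h["value"] for h in request["headers"].get("accept", [])), "")
    fmt = "webp" if "image/webp" in accept else "jpeg"

    # /images/cat.jpg?w=400  ->  /images/400/webp/cat.jpg
    prefix, _, name = request["uri"].rpartition("/")
    request["uri"] = f"{prefix}/{width}/{fmt}/{name}"
    return request
```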
Question 522
A company hosts an application on Amazon EC2 instances that run in a single Availability Zone. The application is accessible by using the transport layer of the Open Systems Interconnection (OSI) model. The company needs the application architecture to have high availability. Which combination of steps will meet these requirements MOST cost-effectively? (Select TWO.)
Explanation:
To achieve high availability for an application that runs on EC2 instances, the application should be deployed across multiple Availability Zones with a load balancer distributing the traffic. An Auto Scaling group can launch and manage EC2 instances in multiple Availability Zones and replace instances that fail health checks. A Network Load Balancer operates at Layer 4 (the transport layer), which matches how the application is accessed, and distributes TCP/UDP traffic to the EC2 instances.
Reference:
Auto Scaling Groups
What Is a Network Load Balancer?
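A hedged boto3 sketch of the two steps, assuming an existing launch template; all names and IDs (subnets, VPC, launch template) are placeholders:

```python
# Illustrative boto3 calls: a Network Load Balancer (Layer 4) in two AZs,
# plus an Auto Scaling group that registers instances with its target group.
import boto3

elbv2 = boto3.client("elbv2")
autoscaling = boto3.client("autoscaling")

nlb = elbv2.create_load_balancer(
    Name="app-nlb", Type="network",
    Subnets=["subnet-aaa", "subnet-bbb"],  # one public subnet per AZ (placeholders)
)["LoadBalancers"][0]

tg = elbv2.create_target_group(
    Name="app-tcp-tg", Protocol="TCP", Port=8080,
    VpcId="vpc-1234567890abcdef0", TargetType="instance",
)["TargetGroups"][0]

elbv2.create_listener(
    LoadBalancerArn=nlb["LoadBalancerArn"],
    Protocol="TCP", Port=80,
    DefaultActions=[{"Type": "forward", "TargetGroupArn": tg["TargetGroupArn"]}],
)

autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="app-asg",
    LaunchTemplate={"LaunchTemplateName": "app-launch-template", "Version": "$Latest"},
    MinSize=2, MaxSize=6, DesiredCapacity=2,
    VPCZoneIdentifier="subnet-ccc,subnet-ddd",  # private subnets in two AZs (placeholders)
    TargetGroupARNs=[tg["TargetGroupArn"]],
    HealthCheckType="ELB",  # replace instances that fail NLB health checks
)
```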
Question 523
A company is preparing a new data platform that will ingest real-time streaming data from multiple sources. The company needs to transform the data before writing the data to Amazon S3. The company needs the ability to use SQL to query the transformed data.
Which solutions will meet these requirements? (Choose two.)
Explanation:
To ingest, transform, and query real-time streaming data from multiple sources, two combinations work. With Amazon Kinesis, Kinesis Data Streams ingests the data from the sources, Kinesis Data Analytics transforms it using SQL or Apache Flink, Kinesis Data Firehose writes the transformed data to Amazon S3, and Amazon Athena queries it from S3 with standard SQL. With Amazon MSK, the data is streamed through Apache Kafka, AWS Glue transforms it using Apache Spark or Python scripts and writes it to Amazon S3, and Amazon Athena again queries the results with standard SQL.
Reference:
What Is Amazon Kinesis Data Streams?
What Is Amazon Kinesis Data Analytics?
What Is Amazon Kinesis Data Firehose?
What Is Amazon Athena?
What Is Amazon MSK?
What Is AWS Glue?
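A hedged boto3 sketch of the ingest and query ends of the Kinesis-based option; the stream, database, table, and bucket names are placeholder assumptions:

```python
# Illustrative boto3 sketch: publish a record to a Kinesis data stream, then
# query the transformed output that was delivered to S3 using Athena SQL.
import json
import boto3

kinesis = boto3.client("kinesis")
athena = boto3.client("athena")

# Producer side: a source pushes a JSON event into the stream.
kinesis.put_record(
    StreamName="ingest-stream",
    Data=json.dumps({"sensor_id": "s-001", "value": 42}).encode("utf-8"),
    PartitionKey="s-001",
)

# Query side: after Kinesis Data Analytics / Firehose (or MSK + Glue) has
# written transformed records to S3, Athena queries them with standard SQL.
athena.start_query_execution(
    QueryString="SELECT sensor_id, AVG(value) AS avg_value "
                "FROM transformed_events GROUP BY sensor_id",
    QueryExecutionContext={"Database": "streaming_db"},
    ResultConfiguration={"OutputLocation": "s3://query-results-bucket/athena/"},
)
```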
Question 524
A company has two VPCs that are located in the us-west-2 Region within the same AWS account. The company needs to allow network traffic between these VPCs. Approximately 500 GB of data transfer will occur between the VPCs each month.
What is the MOST cost-effective solution to connect these VPCs?
Explanation:
To connect two VPCs in the same Region within the same AWS account, VPC peering is the most cost-effective solution. VPC peering allows direct network traffic between the VPCs without requiring a gateway, VPN connection, or AWS Transit Gateway, and there is no hourly charge for the peering connection itself. Data transferred over the peering connection within the same Availability Zone is free, and cross-AZ traffic is billed at standard intra-Region rates, so roughly 500 GB per month costs far less than it would through a transit gateway or VPN.
Reference:
What Is VPC Peering?
VPC Peering Pricing
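A hedged boto3 sketch of setting up the peering connection; the VPC IDs, route table IDs, and CIDR blocks are placeholders:

```python
# Illustrative boto3 sketch: peer two VPCs in the same account and Region,
# then add routes in both directions over the peering connection.
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

peering = ec2.create_vpc_peering_connection(
    VpcId="vpc-aaaa1111", PeerVpcId="vpc-bbbb2222",
)["VpcPeeringConnection"]

# Same-account peering still requires an explicit accept.
ec2.accept_vpc_peering_connection(
    VpcPeeringConnectionId=peering["VpcPeeringConnectionId"]
)

# Route each VPC's traffic for the other VPC's CIDR over the peering connection.
ec2.create_route(RouteTableId="rtb-aaaa1111", DestinationCidrBlock="10.1.0.0/16",
                 VpcPeeringConnectionId=peering["VpcPeeringConnectionId"])
ec2.create_route(RouteTableId="rtb-bbbb2222", DestinationCidrBlock="10.0.0.0/16",
                 VpcPeeringConnectionId=peering["VpcPeeringConnectionId"])
```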
Question 525
A company wants to use an event-driven programming model with AWS Lambda. The company wants to reduce startup latency for Lambda functions that run on Java 11. The company does not have strict latency requirements for the applications. The company wants to reduce cold starts and outlier latencies when a function scales up.
Which solution will meet these requirements MOST cost-effectively?
Explanation:
To reduce startup latency for Lambda functions that run on Java 11, Lambda SnapStart is a suitable solution. Lambda SnapStart reduces cold starts and outlier latencies for Java functions: when a function version is published, Lambda initializes it once, takes a snapshot of the initialized execution environment, and caches that snapshot, so new execution environments resume from the snapshot instead of initializing the JVM and application code from scratch. SnapStart for Java is available at no additional cost.
Reference:
Lambda SnapStart for Java Functions
Lambda SnapStart FAQs
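A hedged boto3 sketch of enabling SnapStart on an existing Java function; the function name is a placeholder:

```python
# Illustrative boto3 sketch: turn on SnapStart, then publish a version, since
# SnapStart applies to published versions (and aliases), not $LATEST.
import boto3

lambda_client = boto3.client("lambda")

lambda_client.update_function_configuration(
    FunctionName="orders-java-function",
    SnapStart={"ApplyOn": "PublishedVersions"},
)

# Wait for the configuration update to finish before publishing.
lambda_client.get_waiter("function_updated_v2").wait(
    FunctionName="orders-java-function"
)

# Publishing triggers the snapshot of the initialized execution environment;
# invoke the published version (or an alias pointing to it) to benefit.
version = lambda_client.publish_version(FunctionName="orders-java-function")
print("Invoke version:", version["Version"])
```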
Question 526
A company has created a multi-tier application for its ecommerce website. The website uses an Application Load Balancer that resides in the public subnets, a web tier in the public subnets, and a MySQL cluster hosted on Amazon EC2 instances in the private subnets. The MySQL database needs to retrieve product catalog and pricing information that is hosted on the internet by a third-party provider. A solutions architect must devise a strategy that maximizes security without increasing operational overhead.
What should the solutions architect do to meet these requirements?
Explanation:
To allow the MySQL database in the private subnets to reach the third-party provider on the internet without being exposed to inbound connections, a NAT gateway is a suitable solution. A NAT gateway enables instances in a private subnet to connect to the internet or other AWS services while preventing the internet from initiating connections to those instances. A NAT gateway resides in a public subnet, handles high throughput with low latency, and is a managed service that requires no patching, scaling, or maintenance.
Reference:
NAT Gateways
NAT Gateway Pricing
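A hedged boto3 sketch of the NAT gateway setup; the subnet and route table IDs are placeholders:

```python
# Illustrative boto3 sketch: create a NAT gateway in a public subnet and give
# the private route table a default route through it.
import boto3

ec2 = boto3.client("ec2")

eip = ec2.allocate_address(Domain="vpc")
nat = ec2.create_nat_gateway(
    SubnetId="subnet-public-1", AllocationId=eip["AllocationId"]
)["NatGateway"]

# Wait until the NAT gateway is available before routing traffic through it.
ec2.get_waiter("nat_gateway_available").wait(NatGatewayIds=[nat["NatGatewayId"]])

# Outbound traffic from the private (database) subnets goes out via the NAT
# gateway; inbound connections from the internet cannot come back through it.
ec2.create_route(
    RouteTableId="rtb-private-1",
    DestinationCidrBlock="0.0.0.0/0",
    NatGatewayId=nat["NatGatewayId"],
)
```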
Question 527
A solutions architect is designing a workload that will store hourly energy consumption by business tenants in a building. The sensors will feed a database through HTTP requests that will add up usage for each tenant. The solutions architect must use managed services when possible. The workload will receive more features in the future as the solutions architect adds independent components.
Which solution will meet these requirements with the LEAST operational overhead?
Explanation:
To ingest the sensor readings over HTTP with managed services and leave room for independent components later, Amazon API Gateway, AWS Lambda, and Amazon DynamoDB are suitable. Amazon API Gateway receives the HTTP requests from the sensors and invokes an AWS Lambda function to process each reading. AWS Lambda runs the code without provisioning or managing servers and scales automatically with the incoming requests. Amazon DynamoDB stores the per-tenant usage totals in a fast, flexible NoSQL database that handles any amount of data with consistent performance.
Reference:
What Is Amazon API Gateway?
What Is AWS Lambda?
What Is Amazon DynamoDB?
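A hedged sketch of the Lambda function that API Gateway could invoke for each sensor reading; the table name, key schema, and request fields are illustrative assumptions:

```python
# Illustrative Lambda handler behind API Gateway: each HTTP POST carries a
# sensor reading, and the handler adds it to the tenant's hourly total.
import json
from decimal import Decimal

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("EnergyUsage")  # assumed table name


def handler(event, context):
    # Parse floats as Decimal because the DynamoDB resource API rejects float.
    body = json.loads(event["body"], parse_float=Decimal)
    # e.g. {"tenant_id": "t-42", "hour": "2024-01-01T10", "kwh": 1.5}

    # Atomically add this reading to the tenant's running total for the hour.
    table.update_item(
        Key={"tenant_id": body["tenant_id"], "hour": body["hour"]},
        UpdateExpression="ADD kwh_total :v",
        ExpressionAttributeValues={":v": body["kwh"]},
    )
    return {"statusCode": 200, "body": json.dumps({"status": "recorded"})}
```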
Question 528
A company has a financial application that produces reports. The reports average 50 KB in size and are stored in Amazon S3. The reports are frequently accessed during the first week after production and must be stored for several years. The reports must be retrievable within 6 hours.
Which solution meets these requirements MOST cost-effectively?
Explanation:
To store reports that are frequently accessed during the first week and must be retained for several years, S3 Standard combined with S3 Glacier Flexible Retrieval is the most cost-effective approach. S3 Standard offers high durability, availability, and performance for the frequently accessed first week. An S3 Lifecycle rule can transition the reports to S3 Glacier Flexible Retrieval after 7 days, which sharply reduces storage costs for the archive period. Standard retrievals from S3 Glacier Flexible Retrieval typically complete within 3-5 hours, which satisfies the 6-hour retrieval requirement.
Reference:
Storage Classes
Object Lifecycle Management
Retrieving Archived Objects from Amazon S3 Glacier
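A hedged boto3 sketch of the lifecycle rule and a later standard-tier restore; the bucket name, prefix, and object key are placeholders:

```python
# Illustrative boto3 sketch: transition report objects from S3 Standard to
# S3 Glacier Flexible Retrieval after 7 days, and restore one later.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="financial-reports-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "reports-to-glacier-after-7-days",
                "Filter": {"Prefix": "reports/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 7, "StorageClass": "GLACIER"}],
            }
        ]
    },
)

# A standard restore (typically 3-5 hours) satisfies the 6-hour target.
s3.restore_object(
    Bucket="financial-reports-bucket",
    Key="reports/2021/q1-summary.pdf",
    RestoreRequest={"Days": 7, "GlacierJobParameters": {"Tier": "Standard"}},
)
```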
Question 529
A company has data collection sensors at different locations. The data collection sensors stream a high volume of data to the company. The company wants to design a platform on AWS to ingest and process high-volume streaming data. The solution must be scalable and support data collection in near real time. The company must store the data in Amazon S3 for future reporting.
Which solution will meet these requirements with the LEAST operational overhead?
Explanation:
To ingest and process high-volume streaming data with the least operational overhead, Amazon Kinesis Data Firehose is a suitable solution. Kinesis Data Firehose can capture, transform, and deliver streaming data to Amazon S3 or other destinations in near real time, and it scales automatically to match the throughput of the incoming data. It is also a fully managed service, so there are no servers to provision or manage.
Reference:
What Is Amazon Kinesis Data Firehose?
Amazon Kinesis Data Firehose Pricing
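A hedged boto3 sketch of the Firehose delivery stream and a direct put from a producer; the stream name, IAM role ARN, and bucket ARN are placeholders:

```python
# Illustrative boto3 sketch: a Kinesis Data Firehose delivery stream that
# buffers incoming sensor records and delivers them to S3.
import json
import boto3

firehose = boto3.client("firehose")

firehose.create_delivery_stream(
    DeliveryStreamName="sensor-ingest",
    DeliveryStreamType="DirectPut",
    ExtendedS3DestinationConfiguration={
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
        "BucketARN": "arn:aws:s3:::sensor-data-bucket",
        "Prefix": "raw/",
        "BufferingHints": {"IntervalInSeconds": 60, "SizeInMBs": 5},
    },
)

# Sensors (or an ingest API in front of them) push records directly.
firehose.put_record(
    DeliveryStreamName="sensor-ingest",
    Record={
        "Data": (json.dumps({"sensor_id": "s-001", "reading": 21.7}) + "\n").encode("utf-8")
    },
)
```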
Question 530
A solutions architect is designing the storage architecture for a new web application used for storing and viewing engineering drawings. All application components will be deployed on the AWS infrastructure.
The application design must support caching to minimize the amount of time that users wait for the engineering drawings to load. The application must be able to store petabytes of data. Which combination of storage and caching should the solutions architect use?
Explanation:
To store and view engineering drawings with caching support, Amazon S3 and Amazon CloudFront are suitable solutions. Amazon S3 can store any amount of data with high durability, availability, and performance. Amazon CloudFront can distribute the engineering drawings to edge locations closer to the users, which can reduce the latency and improve the user experience. Amazon CloudFront can also cache the engineering drawings at the edge locations, which can minimize the amount of time that users wait for the drawings to load.
Reference:
What Is Amazon S3?
What Is Amazon CloudFront?
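A hedged sketch of one way the caching side can be driven from the application: setting a Cache-Control header when a drawing is uploaded to S3 so that CloudFront honors it at the edge. The bucket name, object key, content type, and CloudFront domain are illustrative assumptions, and creating the distribution itself is omitted:

```python
# Illustrative boto3 sketch: store a drawing in S3 with a Cache-Control header
# so CloudFront edge caches can serve repeat requests without returning to S3.
import boto3

s3 = boto3.client("s3")

s3.upload_file(
    Filename="bridge-section-a.dwg",
    Bucket="engineering-drawings-bucket",
    Key="drawings/bridge-section-a.dwg",
    ExtraArgs={
        "ContentType": "application/acad",
        "CacheControl": "public, max-age=86400",  # let CloudFront cache for 24 hours
    },
)

# The application then serves the drawing through the CloudFront distribution,
# e.g. https://d111111abcdef8.cloudfront.net/drawings/bridge-section-a.dwg,
# so repeat viewers hit the edge cache instead of the S3 origin.
```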