Amazon SAA-C03 Practice Test - Questions Answers, Page 55
List of questions
Question 541

A company runs a three-tier application in two AWS Regions. The web tier, the application tier, and the database tier run on Amazon EC2 instances. The company uses Amazon RDS for Microsoft SQL Server Enterprise for the database tier. The database tier is experiencing high load when weekly and monthly reports are run. The company wants to reduce the load on the database tier.
Which solution will meet these requirements with the LEAST administrative effort?
Explanation:
The correct option allows the company to create read replicas of its RDS database and reduce the load on the database tier. Read replicas offload read traffic from the primary DB instance to one or more replicas. By configuring the weekly and monthly reports to use the read replicas, the company improves the performance and availability of the database tier with minimal administrative effort. Reference:
Working with Read Replicas
Read Replicas for Amazon RDS for SQL Server
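As a rough illustration of the approach above, a read replica could be created with the AWS SDK for Python (boto3); the instance identifiers and instance class are placeholders, not values from the question:

import boto3

rds = boto3.client("rds")

# Create a read replica of the reporting database. The weekly and monthly
# reports are then pointed at the replica's endpoint instead of the primary.
rds.create_db_instance_read_replica(
    DBInstanceIdentifier="reporting-replica-1",      # placeholder replica name
    SourceDBInstanceIdentifier="prod-sqlserver-db",  # placeholder source instance
    DBInstanceClass="db.r5.xlarge",                  # placeholder instance class
)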
Question 542

A company runs a website that stores images of historical events. Website users need the ability to search and view images based on the year that the event in the image occurred. On average, users request each image only once or twice a year. The company wants a highly available solution to store and deliver the images to users.
Which solution will meet these requirements MOST cost-effectively?
Explanation:
Storing the images in Amazon S3 Standard and serving them directly through S3 static website hosting provides a highly available, cost-effective solution. S3 Standard is a durable, scalable, and secure object storage class that offers high availability and performance. Delivering the images directly from a static website hosted on S3 removes the need to run web servers and reduces operational overhead, and S3 Standard charges no per-GB retrieval fees. Reference:
Amazon S3 Storage Classes
Hosting a Static Website on Amazon S3
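As a minimal sketch of the setup described above (the bucket name and document keys are placeholders), static website hosting can be enabled on the image bucket with boto3:

import boto3

s3 = boto3.client("s3")

# Enable static website hosting so S3 serves the images directly,
# with no web servers to manage.
s3.put_bucket_website(
    Bucket="historical-images-example",   # placeholder bucket name
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)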
Question 543

A solutions architect needs to review a company's Amazon S3 buckets to discover personally identifiable information (PII). The company stores the PII data in the us-east-1 Region and the us-west-2 Region.
Which solution will meet these requirements with the LEAST operational overhead?
Explanation:
Amazon Macie allows the solutions architect to review the S3 buckets and discover personally identifiable information (PII) with the least operational overhead. Amazon Macie is a fully managed data security and data privacy service that uses machine learning and pattern matching to discover and protect sensitive data in AWS. Enabling Macie in each Region that holds the data (us-east-1 and us-west-2) provides insight into the type, location, and sensitivity level of the data in the S3 buckets. Reference:
Amazon Macie
Analyzing data with Amazon Macie
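For illustration, a one-time Macie discovery job could be started in each Region with boto3; the account ID and bucket name are placeholders, and Macie is assumed not to be enabled yet:

import boto3

# Macie is a Regional service, so repeat this in us-east-1 and us-west-2.
macie = boto3.client("macie2", region_name="us-east-1")
macie.enable_macie()   # only needed if Macie is not already enabled

# One-time sensitive data discovery job over the buckets in this Region.
macie.create_classification_job(
    jobType="ONE_TIME",
    name="pii-discovery-us-east-1",
    s3JobDefinition={
        "bucketDefinitions": [
            {"accountId": "111122223333", "buckets": ["example-bucket-us-east-1"]}
        ]
    },
)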
Question 544

A company is building an ecommerce application and needs to store sensitive customer information.
The company needs to give customers the ability to complete purchase transactions on the website.
The company also needs to ensure that sensitive customer data is protected, even from database administrators.
Which solution meets these requirements?
Explanation:
Using AWS Key Management Service (AWS KMS) client-side encryption allows the company to store sensitive customer information in a managed AWS service while still letting customers complete purchase transactions on the website. The application encrypts the data with a KMS key before sending it to Amazon RDS for MySQL, so sensitive customer data is protected even from database administrators: only the application has access to the encryption key and can decrypt the stored values. Reference:
Using Encryption with Amazon RDS for MySQL
Encrypting Amazon RDS Resources
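As a simplified sketch of the client-side encryption described above (the key alias and sample value are placeholders; a production application would more likely use envelope encryption or the AWS Encryption SDK), the application could encrypt a value before writing it to MySQL:

import boto3

kms = boto3.client("kms")

# Encrypt in the application before the value ever reaches the database.
resp = kms.encrypt(
    KeyId="alias/customer-data",           # placeholder key alias
    Plaintext=b"4111 1111 1111 1111",      # placeholder sensitive value
)
ciphertext = resp["CiphertextBlob"]        # store this in the MySQL column

# Only principals allowed to use the KMS key can decrypt, so database
# administrators reading the table see only ciphertext.
plaintext = kms.decrypt(CiphertextBlob=ciphertext)["Plaintext"]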
Question 545

A solutions architect needs to ensure that API calls to Amazon DynamoDB from Amazon EC2 instances in a VPC do not travel across the internet.
Which combination of steps should the solutions architect take to meet this requirement? (Choose two.)
Explanation:
Creating a gateway endpoint for DynamoDB and adding a route table entry for the endpoint ensure that API calls to Amazon DynamoDB from the EC2 instances in the VPC do not travel across the internet. The gateway endpoint provides private connectivity between the VPC and DynamoDB, and the route table entry directs DynamoDB-bound traffic from the instances' subnets through the endpoint rather than through an internet gateway or NAT gateway. Gateway endpoints are controlled through route tables and endpoint policies, not security groups. Reference:
Gateway Endpoints
Controlling Access to Services with VPC Endpoints
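For illustration, the gateway endpoint could be created with boto3; the Region, VPC ID, and route table ID are placeholders:

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Gateway endpoint for DynamoDB. Attaching the route table that the EC2
# instances' subnets use keeps DynamoDB API calls on the AWS network.
ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",                    # placeholder VPC
    ServiceName="com.amazonaws.us-east-1.dynamodb",
    RouteTableIds=["rtb-0123456789abcdef0"],          # placeholder route table
)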
Question 546

A company has a service that reads and writes large amounts of data from an Amazon S3 bucket in the same AWS Region. The service is deployed on Amazon EC2 instances within the private subnet of a VPC. The service communicates with Amazon S3 over a NAT gateway in the public subnet.
However, the company wants a solution that will reduce the data output costs.
Which solution will meet these requirements MOST cost-effectively?
Explanation:
Provisioning a VPC gateway endpoint for Amazon S3 allows the company to reduce data transfer costs for accessing S3 from the EC2 instances in the VPC. The gateway endpoint provides private connectivity between the VPC and S3, and updating the private subnet's route table to send all S3 traffic through the endpoint removes the NAT gateway from the path, avoiding its per-GB data processing and data transfer charges. Gateway endpoints for S3 incur no additional cost. Reference:
VPC Endpoints for Amazon S3
VPC Endpoints Pricing
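As a sketch of the setup above (IDs, Region, and bucket name are placeholders), the S3 gateway endpoint could be created with an endpoint policy that limits access to the service's bucket:

import boto3, json

ec2 = boto3.client("ec2", region_name="us-east-1")

# Gateway endpoint for S3 attached to the private subnet's route table,
# so S3 traffic bypasses the NAT gateway and its data processing charges.
ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.s3",
    RouteTableIds=["rtb-0123456789abcdef0"],
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": "*",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::example-data-bucket/*",
        }],
    }),
)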
Question 547

A company runs multiple Amazon EC2 Linux instances in a VPC across two Availability Zones. The instances host applications that use a hierarchical directory structure. The applications need to read and write rapidly and concurrently to shared storage.
What should a solutions architect do to meet these requirements?
Explanation:
Amazon Elastic File System (Amazon EFS) allows the EC2 instances to read and write rapidly and concurrently to shared storage across the two Availability Zones. Amazon EFS provides a scalable, elastic, and highly available NFS file system that can be mounted from multiple EC2 instances at the same time. It supports high levels of throughput and IOPS with consistently low latencies, and its POSIX-compliant file locking and support for thousands of concurrent connections enable a high degree of concurrency over a hierarchical directory structure. Reference:
Amazon EFS Features
Using Amazon EFS with Amazon EC2
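For illustration, the shared file system described above could be provisioned with boto3 and then mounted from the instances; the subnet and security group IDs are placeholders:

import boto3

efs = boto3.client("efs")

# Regional EFS file system that instances in both Availability Zones can mount.
fs = efs.create_file_system(
    PerformanceMode="generalPurpose",
    ThroughputMode="elastic",
    Encrypted=True,
)

# One mount target per Availability Zone (placeholder subnet IDs).
for subnet_id in ["subnet-aaaa1111", "subnet-bbbb2222"]:
    efs.create_mount_target(
        FileSystemId=fs["FileSystemId"],
        SubnetId=subnet_id,
        SecurityGroups=["sg-0123456789abcdef0"],
    )

# Each instance then mounts the file system over NFS, for example:
#   sudo mount -t efs -o tls fs-12345678:/ /mnt/shared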
Question 548

A company is using AWS Key Management Service (AWS KMS) keys to encrypt AWS Lambda environment variables. A solutions architect needs to ensure that the required permissions are in place to decrypt and use the environment variables.
Which steps must the solutions architect take to implement the correct permissions? (Choose two.)
Explanation:
B and D are the correct answers because they grant the Lambda execution role permission to decrypt and use the environment variables and allow that role in the AWS KMS key policy. The Lambda execution role is an IAM role that gives the function permission to access AWS resources such as AWS KMS, and the KMS key policy is a resource-based policy that controls access to the key. Adding AWS KMS permissions to the execution role and allowing the execution role in the key policy together provide the permissions required to decrypt the environment variables. Reference:
AWS Lambda Execution Role
Using AWS KMS keys in AWS Lambda
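As a sketch of the two permission changes (role name, Region, account ID, and key ID are placeholders), the execution role can be granted kms:Decrypt with an inline policy, and the key policy must allow that role as well:

import boto3, json

iam = boto3.client("iam")

# Allow the Lambda execution role to decrypt the environment variables.
iam.put_role_policy(
    RoleName="my-lambda-execution-role",
    PolicyName="AllowKmsDecryptForEnvVars",
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": "kms:Decrypt",
            "Resource": "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID",
        }],
    }),
)

# The KMS key policy also needs a statement whose Principal is the
# execution role's ARN and whose Action includes "kms:Decrypt".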
Question 549

A company wants to use an AWS CloudFormation stack for its application in a test environment. The company stores the CloudFormation template in an Amazon S3 bucket that blocks public access. The company wants to grant CloudFormation access to the template in the S3 bucket based on specific user requests to create the test environment. The solution must follow security best practices.
Which solution will meet these requirements?
Explanation:
Using an Amazon S3 presigned URL allows CloudFormation to access the template in the S3 bucket without granting public access or creating additional resources. A presigned URL is signed with the credentials of an IAM user or role that has permission to access the object; anyone who receives the URL can use it until it expires after the specified time. By creating a presigned URL for the template object and passing it to the CloudFormation stack, the company grants CloudFormation access to the template only in response to specific user requests, which follows security best practices. Reference:
Using Amazon S3 Presigned URLs
Using Amazon S3 Buckets
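For illustration (the bucket, key, and stack names are placeholders), a short-lived presigned URL for the template can be generated and passed to CloudFormation:

import boto3

s3 = boto3.client("s3")
cfn = boto3.client("cloudformation")

# Presigned GET URL for the template object; the bucket stays non-public.
template_url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "cfn-templates-example", "Key": "test-env.yaml"},
    ExpiresIn=900,   # URL expires after 15 minutes
)

# CloudFormation fetches the template through the presigned URL.
cfn.create_stack(
    StackName="test-environment",
    TemplateURL=template_url,
)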
Question 550

A company runs an application on AWS. The application receives inconsistent amounts of usage. The application uses AWS Direct Connect to connect to an on-premises MySQL-compatible database. The on-premises database consistently uses a minimum of 2 GiB of memory.
The company wants to migrate the on-premises database to a managed AWS service. The company wants to use auto scaling capabilities to manage unexpected workload increases.
Which solution will meet these requirements with the LEAST administrative overhead?
Explanation:
Amazon Aurora Serverless v2 allows the company to migrate the on-premises database to a managed AWS service that scales automatically with the least administrative overhead. Aurora Serverless v2 adjusts compute capacity in fine-grained increments based on workload demand, so it can absorb unexpected workload increases without manual intervention. It is available for the MySQL-compatible edition of Aurora, and the existing AWS Direct Connect connection can be used to reach the database in the VPC during and after the migration. Reference:
Amazon Aurora Serverless v2
Connecting to an Amazon Aurora DB Cluster
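As a rough sketch of the target setup (identifiers, credentials, and capacity limits are placeholders), an Aurora MySQL-compatible cluster with Serverless v2 scaling could be created like this; 1 ACU corresponds to roughly 2 GiB of memory, so a minimum of 1 ACU covers the on-premises database's 2 GiB baseline:

import boto3

rds = boto3.client("rds")

# Aurora MySQL-compatible cluster with Serverless v2 scaling limits.
rds.create_db_cluster(
    DBClusterIdentifier="app-aurora-cluster",
    Engine="aurora-mysql",
    MasterUsername="admin",
    MasterUserPassword="REPLACE_ME",
    ServerlessV2ScalingConfiguration={"MinCapacity": 1, "MaxCapacity": 16},
)

# The writer instance uses the special db.serverless instance class.
rds.create_db_instance(
    DBInstanceIdentifier="app-aurora-writer",
    DBClusterIdentifier="app-aurora-cluster",
    DBInstanceClass="db.serverless",
    Engine="aurora-mysql",
)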