Amazon SAA-C03 Practice Test - Questions Answers, Page 10

A company has applications that run on Amazon EC2 instances in a VPC. One of the applications needs to call the Amazon S3 API to store and read objects. According to the company's security regulations, no traffic from the applications is allowed to travel across the internet.

Which solution will meet these requirements?

A. Configure an S3 interface endpoint.
B. Configure an S3 gateway endpoint.
C. Create an S3 bucket in a private subnet.
D. Create an S3 bucket in the same Region as the EC2 instance.
Suggested answer: A

Explanation:

https://docs.aws.amazon.com/vpc/latest/privatelink/gateway-endpoints.html
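Both endpoint types keep S3 traffic on the AWS network; a gateway endpoint works by adding an entry to the VPC's route tables. As a sketch only (the VPC and route table IDs are placeholders, not from the question), a CloudFormation fragment for a gateway endpoint might look like:

```yaml
# Sketch: resource IDs are placeholders.
S3GatewayEndpoint:
  Type: AWS::EC2::VPCEndpoint
  Properties:
    VpcEndpointType: Gateway
    ServiceName: com.amazonaws.us-east-1.s3   # must match the VPC's Region
    VpcId: vpc-0123example
    RouteTableIds:
      - rtb-0123example
```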

A company is storing sensitive user information in an Amazon S3 bucket. The company wants to provide secure access to this bucket from the application tier running on Amazon EC2 instances inside a VPC. Which combination of steps should a solutions architect take to accomplish this? (Select TWO.)

A. Configure a VPC gateway endpoint for Amazon S3 within the VPC.
B. Create a bucket policy to make the objects in the S3 bucket public.
C. Create a bucket policy that limits access to only the application tier running in the VPC.
D. Create an IAM user with an S3 access policy and copy the IAM credentials to the EC2 instance.
E. Create a NAT instance and have the EC2 instances use the NAT instance to access the S3 bucket.
Suggested answer: A, C

Explanation:

https://aws.amazon.com/premiumsupport/knowledge-center/s3-private-connection-no-authentication/
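A bucket policy can restrict access to requests that arrive through the VPC endpoint by matching the `aws:SourceVpce` condition key. The fragment below is a hedged sketch; the bucket name and endpoint ID are placeholders, not values from the question:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyAccessExceptFromVpcEndpoint",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::example-bucket",
        "arn:aws:s3:::example-bucket/*"
      ],
      "Condition": {
        "StringNotEquals": { "aws:SourceVpce": "vpce-0123example" }
      }
    }
  ]
}
```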

A company runs an on-premises application that is powered by a MySQL database. The company is migrating the application to AWS to increase the application's elasticity and availability. The current architecture shows heavy read activity on the database during times of normal operation. Every 4 hours, the company's development team pulls a full export of the production database to populate a database in the staging environment. During this period, users experience unacceptable application latency. The development team is unable to use the staging environment until the procedure completes. A solutions architect must recommend a replacement architecture that alleviates the application latency issue.

The replacement architecture also must give the development team the ability to continue using the staging environment without delay. Which solution meets these requirements?

A. Use Amazon Aurora MySQL with Multi-AZ Aurora Replicas for production. Populate the staging database by implementing a backup and restore process that uses the mysqldump utility.
B. Use Amazon Aurora MySQL with Multi-AZ Aurora Replicas for production. Use database cloning to create the staging database on-demand.
C. Use Amazon RDS for MySQL with a Multi-AZ deployment and read replicas for production. Use the standby instance for the staging database.
D. Use Amazon RDS for MySQL with a Multi-AZ deployment and read replicas for production. Populate the staging database by implementing a backup and restore process that uses the mysqldump utility.
Suggested answer: B
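Aurora cloning uses a copy-on-write protocol, so the staging cluster is available almost immediately and places no export load on production. As a hedged sketch (cluster identifiers are placeholders), a clone can be created with the AWS CLI:

```shell
# Sketch: cluster identifiers are placeholders.
aws rds restore-db-cluster-to-point-in-time \
  --source-db-cluster-identifier production-cluster \
  --db-cluster-identifier staging-cluster \
  --restore-type copy-on-write \
  --use-latest-restorable-time
```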

A company is designing an application where users upload small files into Amazon S3. After a user uploads a file, the file requires one-time simple processing to transform the data and save the data in JSON format for later analysis. Each file must be processed as quickly as possible after it is uploaded. Demand will vary. On some days, users will upload a high number of files. On other days, users will upload a few files or no files. Which solution meets these requirements with the LEAST operational overhead?

A. Configure Amazon EMR to read text files from Amazon S3. Run processing scripts to transform the data. Store the resulting JSON file in an Amazon Aurora DB cluster.
B. Configure Amazon S3 to send an event notification to an Amazon Simple Queue Service (Amazon SQS) queue. Use Amazon EC2 instances to read from the queue and process the data. Store the resulting JSON file in Amazon DynamoDB.
C. Configure Amazon S3 to send an event notification to an Amazon Simple Queue Service (Amazon SQS) queue. Use an AWS Lambda function to read from the queue and process the data. Store the resulting JSON file in Amazon DynamoDB.
D. Configure Amazon EventBridge (Amazon CloudWatch Events) to send an event to Amazon Kinesis Data Streams when a new file is uploaded. Use an AWS Lambda function to consume the event from the stream and process the data. Store the resulting JSON file in an Amazon Aurora DB cluster.
Suggested answer: C

Explanation:

Amazon S3 sends event notifications about S3 buckets (for example, object created, object removed, or object restored) to an SNS topic in the same Region. The SNS topic publishes the event to an SQS queue in the central Region.

The SQS queue is configured as the event source for your Lambda function and buffers the event messages for the Lambda function. The Lambda function polls the SQS queue for messages and processes the Amazon S3 event notifications according to your application's requirements. https://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/subscribe-a-lambda-function-to-event-notifications-from-s3-buckets-in-different-aws-regions.html
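When SQS is the event source, each SQS record's body contains the S3 notification JSON. The handler below is a minimal sketch of that unwrapping, with a placeholder transform in place of real object processing; the function name and the returned documents are illustrative, not from the question:

```python
import json


def handler(event, context=None):
    """Sketch of a Lambda handler for S3 event notifications delivered via SQS.

    Each SQS record's body holds an S3 event notification. The handler
    extracts the bucket and key, applies a placeholder transform, and
    returns the JSON documents that would be written to DynamoDB.
    """
    results = []
    for record in event["Records"]:  # SQS records
        s3_event = json.loads(record["body"])  # S3 notification inside the body
        for s3_record in s3_event.get("Records", []):
            bucket = s3_record["s3"]["bucket"]["name"]
            key = s3_record["s3"]["object"]["key"]
            # Placeholder transform: real code would fetch and parse the object.
            results.append(json.dumps({"bucket": bucket, "key": key, "status": "processed"}))
    return results
```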

An application allows users at a company's headquarters to access product data. The product data is stored in an Amazon RDS MySQL DB instance. The operations team has isolated an application performance slowdown and wants to separate read traffic from write traffic.

A solutions architect needs to optimize the application's performance quickly.

What should the solutions architect recommend?

A. Change the existing database to a Multi-AZ deployment. Serve the read requests from the primary Availability Zone.
B. Change the existing database to a Multi-AZ deployment. Serve the read requests from the secondary Availability Zone.
C. Create read replicas for the database. Configure the read replicas with half of the compute and storage resources as the source database.
D. Create read replicas for the database. Configure the read replicas with the same compute and storage resources as the source database.
Suggested answer: D

Explanation:

https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_MySQL.Replication.ReadReplicas.html
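A read replica can be created without downtime and the application's read connections pointed at the replica endpoint. As a sketch only (instance identifiers and the instance class are placeholders), the CLI call is:

```shell
# Sketch: identifiers and instance class are placeholders.
aws rds create-db-instance-read-replica \
  --db-instance-identifier prod-db-replica-1 \
  --source-db-instance-identifier prod-db \
  --db-instance-class db.m5.large
```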

An Amazon EC2 administrator created the following policy associated with an IAM group containing several users.

What is the effect of this policy?

A. Users can terminate an EC2 instance in any AWS Region except us-east-1.
B. Users can terminate an EC2 instance with the IP address 10.100.100.1 in the us-east-1 Region.
C. Users can terminate an EC2 instance in the us-east-1 Region when the user's source IP is 10.100.100.254.
D. Users cannot terminate an EC2 instance in the us-east-1 Region when the user's source IP is 10.100.100.254.
Suggested answer: C

Explanation:

The policy denies all EC2 actions in every Region except us-east-1 and allows only users whose source IP is in the 10.100.100.0/24 range to terminate instances. A user with source IP 10.100.100.254 can therefore terminate instances in the us-east-1 Region.
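The policy document itself is not reproduced in this copy. Based on the explanation, a policy with that effect would resemble the following hypothetical reconstruction (the statement Sids are invented labels, and this may differ from the original image):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "HypotheticalDenyOutsideUsEast1",
      "Effect": "Deny",
      "Action": "ec2:*",
      "Resource": "*",
      "Condition": {
        "StringNotEquals": { "ec2:Region": "us-east-1" }
      }
    },
    {
      "Sid": "HypotheticalAllowTerminateFromCidr",
      "Effect": "Allow",
      "Action": "ec2:TerminateInstances",
      "Resource": "*",
      "Condition": {
        "IpAddress": { "aws:SourceIp": "10.100.100.0/24" }
      }
    }
  ]
}
```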

A company has a large Microsoft SharePoint deployment running on-premises that requires Microsoft Windows shared file storage. The company wants to migrate this workload to the AWS Cloud and is considering various storage options. The storage solution must be highly available and integrated with Active Directory for access control. Which solution will satisfy these requirements?

A. Configure Amazon EFS storage and set the Active Directory domain for authentication.
B. Create an SMB file share on an AWS Storage Gateway file gateway in two Availability Zones.
C. Create an Amazon S3 bucket and configure Microsoft Windows Server to mount it as a volume.
D. Create an Amazon FSx for Windows File Server file system on AWS and set the Active Directory domain for authentication.
Suggested answer: D

An image-processing company has a web application that users use to upload images. The application uploads the images into an Amazon S3 bucket. The company has set up S3 event notifications to publish the object creation events to an Amazon Simple Queue Service (Amazon SQS) standard queue. The SQS queue serves as the event source for an AWS Lambda function that processes the images and sends the results to users through email. Users report that they are receiving multiple email messages for every uploaded image. A solutions architect determines that SQS messages are invoking the Lambda function more than once, resulting in multiple email messages. What should the solutions architect do to resolve this issue with the LEAST operational overhead?

A. Set up long polling in the SQS queue by increasing the ReceiveMessage wait time to 30 seconds.
B. Change the SQS standard queue to an SQS FIFO queue. Use the message deduplication ID to discard duplicate messages.
C. Increase the visibility timeout in the SQS queue to a value that is greater than the total of the function timeout and the batch window timeout.
D. Modify the Lambda function to delete each message from the SQS queue immediately after the message is read before processing.
Suggested answer: C
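The rule in option C is simple arithmetic: the message must stay invisible for longer than the function timeout plus the batch window, or SQS redelivers it mid-processing. The sketch below uses assumed example values (30 s timeout, 20 s batch window); note that AWS guidance for Lambda with SQS separately suggests a visibility timeout of at least six times the function timeout:

```python
def min_visibility_timeout(function_timeout_s: int, batch_window_s: int) -> int:
    """Minimum SQS visibility timeout (seconds) so a message is not
    redelivered while a Lambda invocation is still processing it:
    it must be strictly greater than function timeout + batch window."""
    return function_timeout_s + batch_window_s + 1  # strictly greater


# Example: 30 s function timeout, 20 s maximum batching window.
print(min_visibility_timeout(30, 20))
```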

A company is implementing a shared storage solution for a media application that is hosted in the AWS Cloud. The company needs the ability to use SMB clients to access data. The solution must be fully managed. Which AWS solution meets these requirements?

A. Create an AWS Storage Gateway volume gateway. Create a file share that uses the required client protocol. Connect the application server to the file share.
B. Create an AWS Storage Gateway tape gateway. Configure tapes to use Amazon S3. Connect the application server to the tape gateway.
C. Create an Amazon EC2 Windows instance. Install and configure a Windows file share role on the instance. Connect the application server to the file share.
D. Create an Amazon FSx for Windows File Server file system. Attach the file system to the origin server. Connect the application server to the file system.
Suggested answer: D

Explanation:

Amazon FSx has native support for Windows file system features and for the industry-standard Server Message Block (SMB) protocol to access file storage over a network. https://docs.aws.amazon.com/fsx/latest/WindowsGuide/what-is.html

A company's containerized application runs on an Amazon EC2 instance. The application needs to download security certificates before it can communicate with other business applications. The company wants a highly secure solution to encrypt and decrypt the certificates in near real time. The solution also needs to store data in highly available storage after the data is encrypted. Which solution will meet these requirements with the LEAST operational overhead?

A. Create AWS Secrets Manager secrets for encrypted certificates. Manually update the certificates as needed. Control access to the data by using fine-grained IAM access.
B. Create an AWS Lambda function that uses the Python cryptography library to receive and perform encryption operations. Store the function in an Amazon S3 bucket.
C. Create an AWS Key Management Service (AWS KMS) customer managed key. Allow the EC2 role to use the KMS key for encryption operations. Store the encrypted data on Amazon S3.
D. Create an AWS Key Management Service (AWS KMS) customer managed key. Allow the EC2 role to use the KMS key for encryption operations. Store the encrypted data on Amazon Elastic Block Store (Amazon EBS) volumes.
Suggested answer: C
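Option C can be sketched as a small helper: encrypt with the KMS key, then persist the ciphertext in S3. This is a hedged illustration, not the question's implementation; the function name is invented, and the clients are passed in as arguments so real boto3 clients (whose `kms.encrypt` call returns a `CiphertextBlob`) or test stubs can be used:

```python
def encrypt_and_store(kms_client, s3_client, key_id: str, bucket: str, key: str, plaintext: bytes) -> bytes:
    """Encrypt certificate bytes with a KMS customer managed key, then
    store the ciphertext durably in S3. Clients are injected so the same
    code works with boto3 clients or test doubles."""
    # KMS Encrypt returns the ciphertext under the "CiphertextBlob" field.
    ciphertext = kms_client.encrypt(KeyId=key_id, Plaintext=plaintext)["CiphertextBlob"]
    # Persist the encrypted bytes in highly available object storage.
    s3_client.put_object(Bucket=bucket, Key=key, Body=ciphertext)
    return ciphertext
```

With real clients this would be called as `encrypt_and_store(boto3.client("kms"), boto3.client("s3"), ...)`, and the EC2 instance role would need `kms:Encrypt` on the key and `s3:PutObject` on the bucket.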
Total 886 questions