Amazon SAA-C03 Practice Test - Questions & Answers

A company recently migrated to AWS and wants to implement a solution to protect the traffic that flows in and out of the production VPC. The company had an inspection server in its on-premises data center. The inspection server performed specific operations such as traffic flow inspection and traffic filtering. The company wants to have the same functionalities in the AWS Cloud.

Which solution will meet these requirements?

A. Use Amazon GuardDuty for traffic inspection and traffic filtering in the production VPC.
B. Use Traffic Mirroring to mirror traffic from the production VPC for traffic inspection and filtering.
C. Use AWS Network Firewall to create the required rules for traffic inspection and traffic filtering for the production VPC.
D. Use AWS Firewall Manager to create the required rules for traffic inspection and traffic filtering for the production VPC.
Suggested answer: C

Explanation:

AWS Network Firewall is the managed equivalent of the on-premises inspection server: it supports both stateful traffic inspection and traffic filtering through rules applied to traffic entering and leaving the production VPC.
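As a rough illustration, a stateful Network Firewall rule group can carry Suricata-compatible rules. The sketch below only builds the request payload as a plain dict; the rule group name, CIDR, and rule are hypothetical, and in a real deployment the dict would be passed to boto3's `create_rule_group` call.

```python
# Sketch: a stateful rule group definition for AWS Network Firewall.
# The name "prod-vpc-inspection" and CIDR 10.0.0.0/16 are assumptions.
rule_group = {
    "RuleGroupName": "prod-vpc-inspection",  # hypothetical name
    "Type": "STATEFUL",
    "Capacity": 100,
    "RuleGroup": {
        "RulesSource": {
            # Suricata-style rule: drop outbound plaintext HTTP
            # from the production VPC.
            "RulesString": (
                'drop tcp 10.0.0.0/16 any -> any 80 '
                '(msg:"Block outbound HTTP"; sid:1000001; rev:1;)'
            )
        }
    },
}

# In a real deployment this dict would be passed to
# boto3.client("network-firewall").create_rule_group(**rule_group).
```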

A company hosts a data lake on AWS. The data lake consists of data in Amazon S3 and Amazon RDS for PostgreSQL. The company needs a reporting solution that provides data visualization and includes all the data sources within the data lake. Only the company's management team should have full access to all the visualizations. The rest of the company should have only limited access. Which solution will meet these requirements?

A. Create an analysis in Amazon QuickSight. Connect all the data sources and create new datasets. Publish dashboards to visualize the data. Share the dashboards with the appropriate IAM roles.
B. Create an analysis in Amazon QuickSight. Connect all the data sources and create new datasets. Publish dashboards to visualize the data. Share the dashboards with the appropriate users and groups.
C. Create an AWS Glue table and crawler for the data in Amazon S3. Create an AWS Glue extract, transform, and load (ETL) job to produce reports. Publish the reports to Amazon S3. Use S3 bucket policies to limit access to the reports.
D. Create an AWS Glue table and crawler for the data in Amazon S3. Use Amazon Athena Federated Query to access data within Amazon RDS for PostgreSQL. Generate reports by using Amazon Athena. Publish the reports to Amazon S3. Use S3 bucket policies to limit access to the reports.
Suggested answer: B

Explanation:

Amazon QuickSight is a data visualization service that allows you to create interactive dashboards and reports from various data sources, including Amazon S3 and Amazon RDS for PostgreSQL. You can connect all the data sources and create new datasets in QuickSight, then publish dashboards and share them with specific users and groups. This lets the management team receive full access to all visualizations while the rest of the company gets limited, read-only access.
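To make the sharing model concrete, the sketch below builds the kind of permission grants QuickSight dashboards use: one principal gets the full owner action set, another gets read-only actions. The group names and account ID in the ARNs are hypothetical; the entries would be passed as `GrantPermissions` to boto3's `update_dashboard_permissions`.

```python
# Sketch of QuickSight dashboard permissions: the management group gets
# full (owner) actions, a company-wide group gets read-only actions.
# Group names and the account ID (111122223333) are placeholders.
OWNER_ACTIONS = [
    "quicksight:DescribeDashboard",
    "quicksight:ListDashboardVersions",
    "quicksight:QueryDashboard",
    "quicksight:UpdateDashboard",
    "quicksight:DeleteDashboard",
    "quicksight:UpdateDashboardPermissions",
]
READER_ACTIONS = OWNER_ACTIONS[:3]  # describe, list versions, query only

def grant(principal_arn, actions):
    """Build one permission entry for a QuickSight principal."""
    return {"Principal": principal_arn, "Actions": actions}

permissions = [
    grant("arn:aws:quicksight:us-east-1:111122223333:group/default/management",
          OWNER_ACTIONS),
    grant("arn:aws:quicksight:us-east-1:111122223333:group/default/all-employees",
          READER_ACTIONS),
]
# These entries would be passed as GrantPermissions to
# boto3.client("quicksight").update_dashboard_permissions(...).
```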

A company is implementing a new business application. The application runs on two Amazon EC2 instances and uses an Amazon S3 bucket for document storage. A solutions architect needs to ensure that the EC2 instances can access the S3 bucket.

What should the solutions architect do to meet this requirement?

A. Create an IAM role that grants access to the S3 bucket. Attach the role to the EC2 instances.
B. Create an IAM policy that grants access to the S3 bucket. Attach the policy to the EC2 instances.
C. Create an IAM group that grants access to the S3 bucket. Attach the group to the EC2 instances.
D. Create an IAM user that grants access to the S3 bucket. Attach the user account to the EC2 instances.
Suggested answer: A

Explanation:

https://aws.amazon.com/premiumsupport/knowledge-center/ec2-instance-access-s3-bucket/
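Option A involves two policy documents: a trust policy that lets EC2 assume the role, and a permissions policy that grants S3 access. The sketch below builds both as plain dicts; the bucket name "example-docs-bucket" is a hypothetical placeholder.

```python
# Sketch: the two policy documents behind option A.
# "example-docs-bucket" is an assumed bucket name.

# Trust policy: allows the EC2 service to assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "ec2.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

# Permissions policy: grants the role access to the bucket and its objects.
permissions_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::example-docs-bucket",
            "arn:aws:s3:::example-docs-bucket/*",
        ],
    }],
}

# With boto3 the flow would be: iam.create_role(AssumeRolePolicyDocument=
# json.dumps(trust_policy), ...), attach the permissions policy, then
# associate the role with the instances via an instance profile.
```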

An application development team is designing a microservice that will convert large images to smaller, compressed images. When a user uploads an image through the web interface, the microservice should store the image in an Amazon S3 bucket, process and compress the image with an AWS Lambda function, and store the image in its compressed form in a different S3 bucket. A solutions architect needs to design a solution that uses durable, stateless components to process the images automatically. Which combination of actions will meet these requirements? (Choose two.)

A. Create an Amazon Simple Queue Service (Amazon SQS) queue. Configure the S3 bucket to send a notification to the SQS queue when an image is uploaded to the S3 bucket.
B. Configure the Lambda function to use the Amazon Simple Queue Service (Amazon SQS) queue as the invocation source. When the SQS message is successfully processed, delete the message in the queue.
C. Configure the Lambda function to monitor the S3 bucket for new uploads. When an uploaded image is detected, write the file name to a text file in memory and use the text file to keep track of the images that were processed.
D. Launch an Amazon EC2 instance to monitor an Amazon Simple Queue Service (Amazon SQS) queue. When items are added to the queue, log the file name in a text file on the EC2 instance and invoke the Lambda function.
E. Configure an Amazon EventBridge (Amazon CloudWatch Events) event to monitor the S3 bucket. When an image is uploaded, send an alert to an Amazon Simple Notification Service (Amazon SNS) topic with the application owner's email address for further processing.
Suggested answer: A, B

Explanation:

Creating an Amazon Simple Queue Service (Amazon SQS) queue and configuring the S3 bucket to send a notification to the queue when an image is uploaded ensures that the Lambda function is triggered in a stateless and durable manner. Configuring the Lambda function to use the SQS queue as the invocation source, and deleting each message after it is successfully processed, keeps the processing pipeline stateless and durable as well.

Amazon SQS is a fully managed message queuing service that lets you decouple and scale microservices, distributed systems, and serverless applications. SQS eliminates the complexity and overhead associated with managing and operating message-oriented middleware, and empowers developers to focus on differentiating work. When new images are uploaded to the S3 bucket, the resulting SQS messages trigger the Lambda function to process and compress each image; once an image is processed, the message is deleted from the queue.
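The flow above can be sketched as a Lambda handler: each SQS record wraps an S3 event notification for a newly uploaded image. The compression step is a placeholder, and note that with an SQS event source mapping, Lambda deletes messages automatically when the handler returns without error.

```python
import json

def handler(event, context):
    """Sketch of the image-processing Lambda behind options A and B.
    Each SQS record's body is an S3 event notification; the actual
    download/compress/upload work is elided."""
    processed = []
    for record in event["Records"]:            # SQS records
        s3_event = json.loads(record["body"])  # the S3 notification payload
        for s3_record in s3_event.get("Records", []):
            bucket = s3_record["s3"]["bucket"]["name"]
            key = s3_record["s3"]["object"]["key"]
            # ... download the image, compress it, upload the result
            # to the destination bucket ...
            processed.append((bucket, key))
    # Returning normally lets the SQS event source mapping delete the
    # messages; raising an exception leaves them in the queue for retry.
    return processed
```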

A company has a three-tier web application that is deployed on AWS. The web servers are deployed in a public subnet in a VPC. The application servers and database servers are deployed in private subnets in the same VPC. The company has deployed a third-party virtual firewall appliance from AWS Marketplace in an inspection VPC. The appliance is configured with an IP interface that can accept IP packets. A solutions architect needs to integrate the web application with the appliance to inspect all traffic to the application before the traffic reaches the web server. Which solution will meet these requirements with the LEAST operational overhead?

A. Create a Network Load Balancer in the public subnet of the application's VPC to route the traffic to the appliance for packet inspection.
B. Create an Application Load Balancer in the public subnet of the application's VPC to route the traffic to the appliance for packet inspection.
C. Deploy a transit gateway in the inspection VPC. Configure route tables to route the incoming packets through the transit gateway.
D. Deploy a Gateway Load Balancer in the inspection VPC. Create a Gateway Load Balancer endpoint to receive the incoming packets and forward the packets to the appliance.
Suggested answer: D

Explanation:

https://aws.amazon.com/blogs/networking-and-content-delivery/scaling-network-traffic-inspection- using-aws-gateway-load-balancer/
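The routing behind option D can be sketched as two route tables: the ingress path steers traffic to a Gateway Load Balancer endpoint (GWLBe) before it reaches the web subnet, and the return path flows back through the same endpoint. All IDs and CIDRs below are hypothetical placeholders expressed as plain dicts, not real AWS API calls.

```python
# Sketch of the routing behind option D. IDs and CIDRs are assumptions.
ingress_route_table = {
    "routes": [
        # Traffic destined for the web subnet is sent to the GWLB endpoint
        # first, so the appliance inspects it before delivery.
        {"destination": "10.0.1.0/24", "target": "vpce-gwlb-0123"},
    ]
}
web_subnet_route_table = {
    "routes": [
        # Return traffic flows back through the same endpoint, keeping
        # the flow symmetric through the appliance.
        {"destination": "0.0.0.0/0", "target": "vpce-gwlb-0123"},
    ]
}
# The Gateway Load Balancer in the inspection VPC forwards packets to the
# firewall appliance over the GENEVE protocol (port 6081) and returns
# them to the application VPC after inspection.
```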

A company collects temperature, humidity, and atmospheric pressure data in cities across multiple continents. The average volume of data collected per site each day is 500 GB. Each site has a high-speed internet connection. The company's weather forecasting applications are based in a single Region and analyze the data daily.

What is the FASTEST way to aggregate data from all of these global sites?

A. Enable Amazon S3 Transfer Acceleration on the destination bucket. Use multipart uploads to directly upload site data to the destination bucket.
B. Upload site data to an Amazon S3 bucket in the closest AWS Region. Use S3 cross-Region replication to copy objects to the destination bucket.
C. Schedule AWS Snowball jobs daily to transfer data to the closest AWS Region. Use S3 cross-Region replication to copy objects to the destination bucket.
D. Upload the data to an Amazon EC2 instance in the closest Region. Store the data in an Amazon Elastic Block Store (Amazon EBS) volume. Once a day take an EBS snapshot and copy it to the centralized Region. Restore the EBS volume in the centralized Region and run an analysis on the data daily.
Suggested answer: A

Explanation:

You might want to use Transfer Acceleration on a bucket for various reasons, including the following:

You have customers that upload to a centralized bucket from all over the world.

You transfer gigabytes to terabytes of data on a regular basis across continents.

You are unable to utilize all of your available bandwidth over the internet when uploading to Amazon S3.

https://docs.aws.amazon.com/AmazonS3/latest/dev/transfer-acceleration.html

"Amazon S3 Transfer Acceleration can speed up content transfers to and from Amazon S3 by as much as 50-500% for long-distance transfer of larger objects. Customers who have either web or mobile applications with widespread users or applications hosted far away from their S3 bucket can experience long and variable upload and download speeds over the Internet." (https://aws.amazon.com/s3/transfer-acceleration/)

"Improved throughput - You can upload parts in parallel to improve throughput." (https://docs.aws.amazon.com/AmazonS3/latest/userguide/mpuoverview.html)

A company needs the ability to analyze the log files of its proprietary application. The logs are stored in JSON format in an Amazon S3 bucket. Queries will be simple and will run on demand. A solutions architect needs to perform the analysis with minimal changes to the existing architecture. What should the solutions architect do to meet these requirements with the LEAST amount of operational overhead?

A. Use Amazon Redshift to load all the content into one place and run the SQL queries as needed.
B. Use Amazon CloudWatch Logs to store the logs. Run SQL queries as needed from the Amazon CloudWatch console.
C. Use Amazon Athena directly with Amazon S3 to run the queries as needed.
D. Use AWS Glue to catalog the logs. Use a transient Apache Spark cluster on Amazon EMR to run the SQL queries as needed.
Suggested answer: C

Explanation:

Amazon Athena can query JSON data directly in Amazon S3 using standard SQL, with no servers to manage and no changes to the existing storage layout, which makes it the lowest-overhead option.
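As an illustration, querying JSON logs with Athena typically means defining an external table over the bucket with a JSON SerDe and then running plain SQL. The table name, bucket, and log fields below are hypothetical; the statements are built as strings that would be submitted via boto3's `start_query_execution`.

```python
# Sketch: Athena DDL and query for JSON logs in S3.
# Table/bucket names and the fields (ts, level, message) are assumptions.
create_table = """
CREATE EXTERNAL TABLE IF NOT EXISTS app_logs (
  ts string,
  level string,
  message string
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
LOCATION 's3://example-log-bucket/logs/'
"""

query = "SELECT level, count(*) AS n FROM app_logs GROUP BY level"

# Each statement would be submitted with
# boto3.client("athena").start_query_execution(
#     QueryString=...,
#     ResultConfiguration={"OutputLocation": "s3://example-results/"})
```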

A company uses AWS Organizations to manage multiple AWS accounts for different departments.

The management account has an Amazon S3 bucket that contains project reports. The company wants to limit access to this S3 bucket to only users of accounts within the organization in AWS Organizations. Which solution meets these requirements with the LEAST amount of operational overhead?

A. Add the aws:PrincipalOrgID global condition key with a reference to the organization ID to the S3 bucket policy.
B. Create an organizational unit (OU) for each department. Add the aws:PrincipalOrgPaths global condition key to the S3 bucket policy.
C. Use AWS CloudTrail to monitor the CreateAccount, InviteAccountToOrganization, LeaveOrganization, and RemoveAccountFromOrganization events. Update the S3 bucket policy accordingly.
D. Tag each user that needs access to the S3 bucket. Add the aws:PrincipalTag global condition key to the S3 bucket policy.
Suggested answer: A

Explanation:

https://aws.amazon.com/blogs/security/control-access-to-aws-resources-by-using-the-aws-organization-of-iam-principals/ The aws:PrincipalOrgID global condition key provides an alternative to listing all the account IDs for all AWS accounts in an organization. For example, the following Amazon S3 bucket policy allows members of any account in the XXX organization to add an object into the examtopics bucket.

{"Version": "2020-09-10",

"Statement": {

"Sid": "AllowPutObject",

"Effect": "Allow",

"Principal": "*",

"Action": "s3:PutObject",

"Resource": "arn:aws:s3:::examtopics/*",

"Condition": {"StringEquals":

{"aws:PrincipalOrgID":["XXX"]}}}}

https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html

An application runs on an Amazon EC2 instance in a VPC. The application processes logs that are stored in an Amazon S3 bucket. The EC2 instance needs to access the S3 bucket without connectivity to the internet. Which solution will provide private network connectivity to Amazon S3?

A. Create a gateway VPC endpoint to the S3 bucket.
B. Stream the logs to Amazon CloudWatch Logs. Export the logs to the S3 bucket.
C. Create an instance profile on Amazon EC2 to allow S3 access.
D. Create an Amazon API Gateway API with a private link to access the S3 endpoint.
Suggested answer: A

Explanation:

A gateway VPC endpoint lets the EC2 instance reach Amazon S3 over the AWS private network instead of the public internet, with no internet gateway or NAT device required.
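The sketch below shows the shape of a gateway endpoint request. The region, VPC ID, and route table ID are hypothetical placeholders; only the service-name pattern `com.amazonaws.<region>.s3` and the `Gateway` endpoint type come from the S3 endpoint model.

```python
# Sketch of the gateway endpoint behind option A.
# The region and the vpc-/rtb- IDs are assumed placeholders.
region = "us-east-1"
endpoint_request = {
    "VpcEndpointType": "Gateway",
    "VpcId": "vpc-0abc1234",
    "ServiceName": f"com.amazonaws.{region}.s3",
    # Associating the private subnet's route table adds an S3 prefix-list
    # route, so S3 traffic stays on the AWS network with no internet path.
    "RouteTableIds": ["rtb-0def5678"],
}
# boto3.client("ec2").create_vpc_endpoint(**endpoint_request)
```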

A company is hosting a web application on AWS using a single Amazon EC2 instance that stores user-uploaded documents in an Amazon EBS volume. For better scalability and availability, the company duplicated the architecture and created a second EC2 instance and EBS volume in another Availability Zone, placing both behind an Application Load Balancer. After completing this change, users reported that, each time they refreshed the website, they could see one subset of their documents or the other, but never all of the documents at the same time.

What should a solutions architect propose to ensure users see all of their documents at once?

A. Copy the data so both EBS volumes contain all the documents.
B. Configure the Application Load Balancer to direct a user to the server with the documents.
C. Copy the data from both EBS volumes to Amazon EFS. Modify the application to save new documents to Amazon EFS.
D. Configure the Application Load Balancer to send the request to both servers. Return each document from the correct server.
Suggested answer: C

Explanation:

Amazon EFS is a shared, network-attached file system that can be mounted by multiple EC2 instances across Availability Zones at the same time. Moving the documents to EFS gives both instances a single, consistent view of all uploads, so users see every document regardless of which instance the load balancer routes them to.


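The effect of shared storage can be sketched without AWS at all: below, a temporary directory stands in for the EFS mount point (e.g. a hypothetical /mnt/efs), and two "instances" writing to it both see all documents, which is exactly what the per-instance EBS volumes could not provide.

```python
# Sketch: with a shared file system, every instance sees every document.
# A temp directory simulates the EFS mount; in the real fix both EC2
# instances would mount the same EFS file system at the same path.
import os
import tempfile

shared_mount = tempfile.mkdtemp()  # stand-in for the EFS mount point

def save_document(instance_name, doc_name, body):
    """Each app server writes to the shared mount instead of local EBS."""
    with open(os.path.join(shared_mount, doc_name), "w") as f:
        f.write(f"{instance_name}: {body}")

# Uploads land on different instances behind the load balancer...
save_document("instance-a", "report1.txt", "q1 numbers")
save_document("instance-b", "report2.txt", "q2 numbers")

# ...but any instance listing the shared mount sees all documents.
all_docs = sorted(os.listdir(shared_mount))
print(all_docs)  # → ['report1.txt', 'report2.txt']
```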
Total 886 questions